eCOMMERCE - Components of the Internet
Author: Alan Brown
The Internet is changing the way customers, suppliers, and companies interact to conduct business, communicate, and collaborate. The Internet is creating huge opportunities to expand existing businesses, and enabling the creation of completely new businesses unthinkable without the business and technology advances fostered by the onset of the Internet age. As succinctly stated by the U.S. Commerce Secretary William Daley:
"Technology is reshaping this economy and transforming businesses and consumers. This is more than e-commerce, or e-mail, or e-trades, or e-files. It is about the ‘e’ in economic opportunity."
This impact has been confirmed in a recent study conducted by the Economist Intelligence Unit (EIU) with Booz-Allen & Hamilton. They surveyed more than 500 senior executives on how the Internet is changing their corporate strategy. The results showed that more than 90 percent believed that the Internet would transform, or have a major impact on, their corporate strategy within the next three years. Furthermore, many of these executives recognized the need to restructure their businesses to take advantage of fundamental changes in their business environment.
However, with these changes come a number of threats. Many organizations are intimidated by the new technologies, unsure of how to take advantage of them, and wondering how these technologies will align with existing investments in skills and infrastructures. What they require is a conceptual framework for understanding application development in the Internet age, coupled with a realistic view of the technologies that will drive this business revolution.
Components and Component-Based Development (CBD) are the approaches that satisfy these needs. More and more we see organizations turning to components as a way to encapsulate existing functionality, acquire third party solutions, and build new services to support emerging business processes. The latest technologies for distributed systems support and encourage a component view of application integration and deployment. Furthermore, component-based development provides a design paradigm well suited to today’s eclectic application solutions needs. This paper examines current directions in enterprise-scale systems, the features of components and component-based development, and their role in building enterprise-scale applications for the Internet age.
In the past decade the role of application development has changed significantly. The last vestiges of the old Computer-Aided Software Engineering (CASE) days of the 1980s have all but disappeared. Yet organizations delivering large-scale software solutions for the enterprise still require powerful tools that enable them to manage costs, stay productive, reduce time-to-market, and maintain and evolve those solutions over extended periods of time.
The increasing heterogeneity, complexity, and distributed nature of deployment architectures only serves to compound the problems faced by software solutions providers. One of the major driving forces for these changes over the past decade has been the massive adoption of the Internet as a deployment target for many systems. From its origins as a means to share documents and perform collaborative research in academia, the Internet (and its related technologies of Intranets and Extranets) has become a primary target for highly interactive systems of all kinds, supporting commercial transactions, information gathering and dissemination, and many forms of education and entertainment.
Today it is clear that the Internet is having a major impact on the way business is conducted. Many new business opportunities are being created. A recently published U.S. Government report, The Economic and Social Impact of Electronic Commerce, estimates that business-to-business electronic commerce will grow to over $1 trillion within 3 to 5 years. Additionally, existing businesses are being transformed. The economic efficiencies of electronic commerce are resulting in lower-cost distribution, tighter inventory control, increased productivity, and improved customer service.
Supporting these changes is a wide collection of web-based technologies, spawning a revolution in the way in which systems are designed, deployed, and evolved. Many organizations see the need to take advantage of the Web using these technologies. Many fewer of those organizations have the knowledge, technology, or skills necessary to do so. Arguably, this mismatch between the needs of organizations and their ability to execute on those needs represents the greatest challenge to organizations since the dawn of the computer age more than 30 years ago. Those that have succeeded have been rewarded with more efficient services meeting the needs of a growing customer base, the flexibility to support the move into new markets, and inflated stock valuations based on their future potential to dominate their chosen domains. Those that have not embraced this technology have at best been sidelined as niche players, or at worst have failed to survive at all.
These challenges faced by organizations assembling software-intensive solutions also provide the greatest opportunity for enterprise application solution vendors. Recognizing the strategic importance of these new technologies to their businesses, many organizations have raised the importance and stature of their Information Technology (IT) departments to levels not seen in the past 20 years. Organizations understand that their ability to compete is substantially limited by their IT department's ability to execute using web-based technologies. Consequently, IT managers are frequently well positioned within the organization to have a voice in strategic decision making, and to obtain funding and support for new ventures. However, with this visibility comes responsibility, and IT managers require partners to help them to succeed. They are looking for an enterprise solution vendor:
Organizations face many pressures as they seek to improve the services they deliver to their markets. By far the most compelling motivation for many organizations is the need to participate in the eBusiness revolution. We examine this need, and briefly review a number of additional business and technology trends within which enterprise-scale software solutions must succeed.
Much has been written about the current changes taking place in the business practices of many organizations as a result of the eBusiness revolution. Taken in its broadest sense, eBusiness is the transformation of key business processes through the use of Internet technologies. The hype around this transformation is so extreme that it is essential to ask the question: is the Internet really revolutionizing the business world?
The answer is an emphatic "yes!". Organizations recognizing the impact of eBusiness are reaping astonishing rewards. Those left behind are in great danger of becoming niche players in an expanding market. What makes the Internet so important is that it enables new kinds of business to be conducted, and completely re-adjusts many of the key market drivers. At least seven key trends are enabled as a result of these technologies:
All of these trends lead many organizations to believe that eBusiness represents the greatest opportunity (and greatest threat) that they now face. Responding to these challenges requires the support of advanced, innovative software solutions. Creating and evolving these solutions to meet these business needs provides the most important goal of IT departments today.
Additionally, many other business trends must be addressed. As organizations compete in a market economy, those able to provide the best services at the most competitive price are likely to succeed. This requires that an organization understand the business context in which it operates, and enables that business to succeed with the computer systems it uses to support it. As changes occur in the business environment, the computer systems themselves must change to continue to provide appropriate support.
Recently, a number of organizations have faced unprecedented changes to their business driven by at least three significant trends.
First, a number of industries have recently faced changes in government policies and practices resulting in major changes to the business practices in those industries. For example, in the United States the banking, insurance, telecommunications, and power delivery industries have all recently faced government deregulation at a state or federal level. These have typically had a major impact on business practices.
Second, the business community has been faced with an increasing number of acquisitions, mergers, and takeovers of organizations. While each such event results in major changes to the business practices within the organizations involved, there are also frequently changes required by a large number of organizations who partner, trade, supply, or compete with those organizations. This can impact tens if not hundreds of organizations, particularly for some of the larger mergers in the banking, defense electronics, and telecommunications industries.
Third, a number of major future events are causing many organizations to review and upgrade their computer systems. The most publicized global event is the Year 2000 (Y2K) problem that faces many computer systems as we reach the end of the millennium. To deal with Y2K, many computer systems must be examined and suitable changes put in place. Similarly, other events such as the European Monetary Union (EMU) are being faced by specific industries, or in particular regions of the world. Each such event requires changes to be made to mission-critical systems.
Recently there have been a number of important advances in computer-based technologies that have made the software industry rethink how software is developed, and offer new opportunities with respect to computer-based support for reuse of software artifacts. The impact of these advances is directly affecting everyone in the software industry. Three of these advances are of particular note.
First, the rapid evolution of hardware technologies has continued for more than a decade. The result has been a continuing improvement in the price/performance ratio of computer technologies. Organizations now typically have vastly more computing power than a few years ago, embodied in a large number of personal computers distributed throughout all levels of the organization.
Second, distributed access to remote information is now less expensive to develop, less cumbersome to maintain, and more user-friendly and responsive. This is a consequence of a number of advances in distributed infrastructure technologies supporting client/server architectures, high throughput networks, and distributed data management. Many distributed infrastructure technologies are now commonplace, supporting a collection of underlying protocols and standards that includes transmission control protocol/internet protocol (TCP/IP), remote procedure call (RPC), distributed computing environment (DCE), and the Common Object Request Broker Architecture (CORBA).
Third, unbounded excitement in the World Wide Web, Internet, and Intranet technologies has changed the way people think about information access and availability. This has led to many new tools, processes, techniques, and technologies to support this new way of thinking and working. What an end user expects from an application is quite different now than it was only a few years ago. In fact, the World Wide Web technologies are having a deep impact on many aspects of application development.
As a consequence of these business and technology drivers, IT resources today are consumed by three basic kinds of activities:
As organizations take part in these activities, they face at least three critical issues.
Issue 1 – Tying Together Many Disparate Services
How can users integrate a web-based front end into a legacy system back end – and then manage and evolve it?
Many organizations are faced with challenging software and hardware environments that have evolved over a number of years. Meeting new business needs means tying together existing systems with new web-based front ends. We illustrate this with an example shown in Figure 1.
Figure 1. A Simplified Example of an E-Commerce Application
Typical e-commerce applications require complex collections of new and existing technologies to be integrated to provide the complete solution. Customers interact with browsers across the Internet, but require access to a variety of back-end systems to carry out their business. This requires new business logic to be developed, transactions to be executed on the back-end systems, reformatting and translation of the data returned, presentation of the data in meaningful ways to the user, and management of the user's interaction with all these systems across a single session. Making the task yet more complex is the fact that this system must deal with all of the scalability, security, and response-time issues typical of any large-scale application.
Issue 2 – Visualizing and Managing Integration and Deployment
How can users substantially reduce the effort required to integrate the next bought, wrapped, or built application?
Many organizations have spent a great deal of time and effort over the past few years to purchase, install, and deploy packaged applications. The most obvious examples of this are the range of Enterprise Resource Planning (ERP) packages from SAP, Baan, and PeopleSoft. The goal of these outsourcing efforts was to reduce internal IT costs by delegating the responsibility for large pieces of the organization’s back office function to these third party solutions. The expectation was that this would reduce maintenance costs and free IT staff to concentrate their efforts on value-added services for the organization.
Figure 2. eBusiness is about Integration
For many organizations taking this approach this goal has not been met. They soon realized that while the cost of purchasing the package was less than the estimated cost of developing the functionality, a large hidden cost was involved in customizing the package to meet the organization’s needs. In some published accounts the cost of customization can rise to over 80% of the total cost of a package replacement project. It appears that much of this effort is involved with understanding the organization’s current business practices, aligning those business practices with the supported processes offered by the purchased package, and tailoring the package for maximum operational efficiency within the organizational context. As illustrated in Figure 2, eBusiness is about integration of existing systems as much as it is about writing web-based client interfaces.
Issue 3 – Managing the Infrastructure for Future Applications
How can users create an application services architecture that adapts quickly enough to keep up with the rapid evolution of Web-based and distributed systems architectures?
With the rapid pace of change of web-based technologies, many organizations find themselves in a classic dilemma. Which technologies do they decide to adopt, and when? If they wait too long for the market to decide on an obvious long-term technology direction, then they will lose customers and market share to their more advanced competitors. If they choose too soon they risk selecting a dead-end technology that will result in wasted effort, expensive maintenance, and delays while the systems are rewritten.
An approach taken by most organizations is to attempt to define a long term strategy based on a set of core web-technologies. This provides the backbone for their future distributed systems, and provides the scope within which other decisions on technology can be made. Unfortunately, defining this strategy is not easy. To be successful, software architects must select from among a myriad of technical choices, and must navigate the many twists and turns of the current web-technology marketplace.
Additionally, the match between business needs and technical solution is key. Many examples exist of wonderful technical solutions being created to solve the wrong business problem! Similarly, many business solutions solve the right business problem, but at the wrong time. The needs have already evolved by the time the system is ready for deployment. To be successful requires a deep understanding of the business, and flexibility in the solution to allow it to evolve as the business needs change.
Addressing these issues requires a new approach to enterprise-scale application development, an approach that recognizes the needs of enterprise-scale systems solutions in the Internet age. This perspective concentrates on three key activities:
These activities are driven by, and must respond to, the needs of a variety of web-based technologies. Furthermore, the assumption in the Internet age is that the deployment platform consists of a web-based solution appropriate for the enterprise.
Enterprise-scale applications are inherently large, complex, and distributed. To be effective they must be deployed to architectures that support these needs. Many choices face the designer in defining the architecture of distributed systems.
Because organizations operate in a decentralized fashion, or do business in many geographic regions, they typically require distributed computing support. Initially this was achieved through remote terminals connected over expensive proprietary lines to centralized mainframe computers. However, application systems in the 1980s began to make a move from mainframe-based to client/server architectures. While this move has been implemented in many ways, the approach is characterized by data-intensive applications that were re-engineered by separating remote data-intensive services from local desktop display functions. The functions that manipulated the data were initially executed on the (mainframe) server, but migrated to the desktop as the performance of desktop machines improved.
This client/server architecture is successful in two particular situations. The first is for data-intensive applications in which most of the business logic can be executed on the server manipulating the data that resides there. The processing takes place on the server, and the results of the processing are transmitted to the client for presentation to the user. This leads to the notion of a "thin client" solution. The second is when there is a high-bandwidth connection between clients and servers (such as a proprietary local area network). In this case large amounts of data can travel between clients and servers without significant impact on the performance of the system. This supports either a "thin client" or a "fat client" solution.
However, there are limitations with the client/server architecture in many kinds of common development situations. In particular, where significant process-oriented business logic is required this approach breaks down. The cause of the breakdown is the inability to adequately address the issue of where to execute this business logic. If it executes on the client, then a number of distribution issues must be addressed that inhibit performance and evolution of the system for large numbers of heterogeneous clients. If it executes on the server it reduces flexibility, introduces a potential performance bottleneck, and too closely ties the business logic to the particular server technology used.
The answer is to introduce one or more intermediate tiers to manage and execute the business logic. These n-tier solutions introduce additional challenges in managing the multiple tiers, but solve many issues with respect to performance, flexibility, and evolution of the systems. N-tier architectures are now the de facto way to architect robust, high-performance, enterprise-scale distributed systems.
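To make this separation concrete, the following sketch places business logic in a middle tier behind a well-defined interface, decoupled from both the data tier beneath it and the client above it. All class and method names here are invented for illustration; they are not drawn from any particular product.

```java
// Data tier: owns persistence, knows nothing about business rules.
interface CustomerStore {
    String lookupName(int customerId);
}

// Middle tier: business logic lives here, behind a well-defined interface,
// so it is tied to neither the client nor the server technology.
interface OrderService {
    String greetingFor(int customerId);
}

class SimpleOrderService implements OrderService {
    private final CustomerStore store;

    SimpleOrderService(CustomerStore store) { this.store = store; }

    public String greetingFor(int customerId) {
        // The business rule executes in the middle tier; only the small
        // result string travels back to the (thin) client.
        return "Welcome back, " + store.lookupName(customerId);
    }
}

public class NTierSketch {
    public static void main(String[] args) {
        CustomerStore store = id -> "Customer #" + id;  // stand-in data tier
        OrderService service = new SimpleOrderService(store);
        System.out.println(service.greetingFor(42));
        // prints "Welcome back, Customer #42"
    }
}
```

Because the client sees only the OrderService interface, the business logic can be relocated or reimplemented without changing client code, which is exactly the flexibility the n-tier approach is intended to provide.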
In the 1990s the importance of the web as a deployment platform for distributed systems has introduced a number of challenges with respect to N-tier distributed systems. In particular, with the advent of the Internet and web technologies, system designers have had to reevaluate the applicability of N-tier architectures, and assess which technologies are appropriate at each tier. At the client tier, for example, end users of web-based systems have introduced two major changes in the architecture of deployed applications:
Furthermore, the performance characteristic of the Internet as a distributed infrastructure for web-based applications encourages a thin client style of architecture for most forms of data-intensive applications. The result of these requirements is the web-based systems architecture in which the web server plays the role of the middle tier. This architecture is illustrated in Figure 3.
Figure 3. A Web Server Application
A browser running on the client displays information by interpreting pages containing commands in HyperText Markup Language (HTML). A request from a browser uses the Hypertext Transfer Protocol (HTTP) to transmit the request, with a Uniform Resource Locator (URL) identifying the web server to respond to the request and the source of the action to be taken. The appropriate web server receives the request and acts upon it. In the simplest case the URL simply identifies another page of HTML to be returned to the browser for display.
In web-based applications the request often requires more complex processing to take place. Typically this combines static text and formatting with the execution of some business logic to create dynamic content for display. This dynamic content means that the information returned to the user is specialized to the request being made, to the identity of the user making the request, or to information obtained from previous steps in the current user session. To create the dynamic content, interaction with external systems or databases may be required using calls to appropriate Application Programming Interfaces (APIs) or database interfaces. In each case the web server programs are scripts containing a combination of business logic, presentation logic, data access logic, and integration logic. This business logic can be implemented in a number of ways, including Active Server Pages (ASPs), Java Servlets, and Java Server Pages (JSPs).
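A minimal sketch of this dynamic-content step follows. To remain self-contained it mimics a servlet's request handling with plain strings rather than the actual Java Servlet API; the page layout, parameter names, and pricing rule are illustrative assumptions for the example.

```java
import java.util.Map;

public class CatalogPage {
    // Static formatting combined with dynamic content computed per request,
    // in the style of a servlet's doGet() or a JSP page.
    public static String render(Map<String, String> params) {
        String user = params.getOrDefault("user", "guest");
        String item = params.getOrDefault("item", "unknown");

        // In a real servlet this business logic might call a back-end
        // system; here a trivial rule stands in for it.
        String price = "widget".equals(item) ? "$9.99" : "price on request";

        return "<html><body>"
             + "<h1>Catalog</h1>"
             + "<p>Hello, " + user + ". The price of " + item
             + " is " + price + ".</p>"
             + "</body></html>";
    }

    public static void main(String[] args) {
        System.out.println(render(Map.of("user", "alice", "item", "widget")));
    }
}
```

The point of the sketch is the mixing the text describes: presentation (the HTML), business logic (the pricing rule), and request handling all live in one script, which is precisely the coupling that motivates the application server discussed below.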
Many useful web-based applications have been developed using this kind of architecture. A number of the most successful sites on the Internet for conducting e-business have deployed some version of this architecture (e.g., amazon.com). However, a number of limitations with this approach are apparent:
The response to this limitation is to separate the user interface-focused processing from the application-focused processing. The former takes place on the web server, while the notion of an Application Server is introduced to manage the application-focused processing.
The response to this limitation is the concept of Enterprise Application Integration (EAI). This offers a standardized way to connect various kinds of existing systems within a web-based application using common connectors and integration approaches.
The response to this limitation is to consider each element in the system to be a component offering its services through well-defined interfaces. The application server supports a component model to allow components to be assembled by connecting the services through their interfaces to offer new business functionality.
Consequently, to build robust, scaleable, web-based applications for the enterprise these three elements must be addressed: Application Servers, Enterprise Application Integration, and Components. We now examine each of these in more detail.
To simplify the development, management, and evolution of web-based systems, the middle tiers have now come to be viewed as logically consisting of two distinct pieces. As illustrated in Figure 4, these two pieces are the web server and the application server.
Figure 4. The Web Server and Application Server in the Middle Tier
As shown in Figure 4, one piece of the middle tier consists of the web server itself, responsible for receiving requests from the clients, parsing those requests, and the generation of the graphical user interface (GUI) for transmission back to the clients. Note that while in theory the web server may be considered simply to be handling the user interface, there are in practice many additional tasks that it will perform. These include:
Hence, while often dismissed as simply user interface management, this collection of tasks makes the web server piece of the application complex to develop and difficult to evolve.
The second piece of the middle tier is the application server. This part of the middle tier is where the business logic itself is executed. It is maintained independently from the web server to ensure that the presentation and user interaction aspects of the application are separated. This allows different teams to specialize in developing the business logic and developing the user interface aspects. It also allows for multiple presentations of the same business logic (for example, when there are different physical devices using the same application).
Currently, there is much confusion about what an application server actually is. Perhaps what we can agree on is that an application server is a standardized set of frameworks and servers on which to build enterprise-scale applications. Because of the broadness of the definition, articles on application servers may discuss some or all aspects of hardware, network connectivity, middleware technology, and software supporting the execution of applications on this platform.
An illustration of the capabilities of the application server is shown in Figure 5. This is a logical view of the application server’s capabilities. Note that it is shown as the key enabling technology for tying back-end data-intensive applications to a variety of clients. As a result, many different vendors frequently describe their products as application server solutions. This includes well-known products such as Microsoft’s MTS, the Object Management Group’s (OMG’s) CORBA, and BEA’s Tuxedo.
Figure 5. A Logical View of an Application Server
In Figure 5 we see that the application server can provide a wide range of support for enterprise-scale systems. In particular, the application server provides a core set of services for coordinating transactions, security, and data management. As a result, an important part of the application server is its ability to handle transactions and deal with asynchronous communication across the pieces of a distributed system. This may be with application server-specific technologies, or via integration with existing transaction processing or message-oriented middleware solutions.
Different kinds of application server technology are now available from a variety of vendors. Each solution has its own advantages and limitations. For example, in some cases they favor a particular kind of middleware, and offer a rich set of services targeting that particular middleware. In other cases, they are aimed specifically at Internet-based applications, and provide robust web server capabilities for supporting e-commerce applications. As a result, the choice of application server technology is a critical one for most organizations, with many competing needs to be weighed.
The application server is simply one piece of the solution to a much broader problem facing many organizations. In solving their IT problems, many organizations find that their IT infrastructure now consists of a puzzling collection of packaged applications, deployed mainframe systems, and a variety of homegrown corporate and departmental utilities. Although they've solved many of their near-term IT needs, they are also finding that they face a growing problem with a key corporate IT need: connecting these solutions for a consistent view of their IT assets. One answer is to combine these isolated solutions to provide increased productivity and business intelligence to corporations. The broad term Enterprise Application Integration (EAI) covers the various aspects of this approach.
Consequently, if you open any magazine on application development, it’s likely that you will see at least one article on Application Integration, or EAI. However, as with most new approaches, the concept of EAI is a mixture of existing technologies and new perspectives, wrapped up with a large dose of marketing hype. As a result, it is important to look at a number of broad aspects of the EAI approach and their impact on enterprise application development as a whole.
Arguably, today there is as much new code being written as there has ever been. Yet looking at current magazines and journals it would be difficult to reach this conclusion. This is because there is a subtle perceptual change of IT organizations’ role. Rather than "a developer of systems", today’s IT organization is viewed as an "information solutions provider" in support of core business functions. These solutions provide critical business value to the corporation.
Seen this way, the IT organization takes a different perspective on how it provides solutions. Certainly there is application development to be carried out. However, in offering solutions, many IT organizations look to out-source solutions via packaged applications, or try to wrap parts of existing systems and make those services available for use in new ways. This reuse of existing assets leads to an integration-centric view of application provisioning. The IT organization must find ways to deliver business value with approaches that allow them to:
As a result, many IT organizations see EAI as a critical aspect of their role—integrating existing services to provide enterprise-scale solutions.
Building new applications that integrate with existing systems has always been a key development challenge. To help with this, most large systems and packages provide Application Programming Interfaces (APIs) to expose their services and data for third-party access. Typically, new applications make calls through these APIs to access and retrieve data from the existing systems.
The additional challenge facing EAI is that many different APIs exist for a variety of existing systems. What is required is a consistent way to connect to a variety of existing systems. To assist with this, a number of EAI connector solutions have been developed. Initially, these connectors provided an intermediate proprietary representation, a set of translators into this intermediate representation from many of the common packages (e.g., the ERP packages from SAP, Baan, and PeopleSoft), and a toolkit for developing custom translators for other packages and systems. Use of these connectors helps to greatly reduce the effort required to integrate a collection of systems.
Proprietary connector-based solutions are now available from a number of vendors, and are used quite extensively in assembling new systems making use of existing services. Currently, the two most popular products are IBM’s MQ Integrator and TSI’s Mercator. Other products in use include NEON, Oberon, and Vitria. Unfortunately, the proprietary nature of the solutions does introduce a significant limitation to their use. To address this many of the EAI connector vendors are moving towards adopting a standard connector approach based on the eXtensible Markup Language (XML). XML is used as the standard intermediate language for transmission of information to and from the existing systems. This approach has advantages because:
In addition, package vendors themselves are currently moving toward the use of XML as the standard technology for import and export of data from their packages. This should reduce the effort and improve the consistency of EAI solutions using a connector approach.
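The connector idea can be sketched as follows: a package-specific record is translated into a standard XML intermediate form that any other system can consume. The fixed-width record layout and the tag names below are assumptions made for the example, not any vendor's actual format.

```java
public class XmlConnector {
    // Translate a fixed-width record (imagined to come from a legacy order
    // system) into a standard XML intermediate representation.
    public static String toXml(String legacyRecord) {
        // Assumed layout: cols 0-5 order id, 6-15 customer, 16+ amount.
        String orderId  = legacyRecord.substring(0, 6).trim();
        String customer = legacyRecord.substring(6, 16).trim();
        String amount   = legacyRecord.substring(16).trim();

        return "<order>"
             + "<id>" + orderId + "</id>"
             + "<customer>" + customer + "</customer>"
             + "<amount>" + amount + "</amount>"
             + "</order>";
    }

    public static void main(String[] args) {
        System.out.println(toXml("A1001 ACME      129.50"));
    }
}
```

A second connector performing the reverse translation for a different package completes the picture: both sides agree only on the XML form, so neither needs to know the other's native record layout.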
However, it’s important to recognize that there are many aspects to providing EAI solutions. Unfortunately, many of the existing articles and products have focused only on connector-based solutions for enterprise resource planning (ERP) packages, such as those offered by SAP, Baan, PeopleSoft, and others. This is one of many misconceptions about EAI.
In fact, there are a number of approaches to application integration, with connectors being just one of them. As illustrated in Figure 6, we can distinguish four distinct approaches to application integration.
Figure 6. Application Integration Approaches
Figure 6 summarizes these four approaches to application integration.
EAI is about using and reusing existing assets to provide enterprise-scale solutions, typically deployed to the Internet. This perspective requires much more of a solution than simply a set of connectors. It requires the capability to understand and describe the business problem being addressed, to architect an appropriate solution meeting those needs, and to provision that solution through a combination of existing packages and systems integrated with newly developed software.
Taking this broader view of EAI leads to a clear conclusion.
We conclude that true EAI needs architected solutions allowing the combination of many different kinds of existing system assets with newly developed code. These systems must be deployable to today’s distributed infrastructure platforms, and managed and maintained as important business solutions for the long-term benefit of the corporation.
For most developers, the Internet is a cost-effective way to distribute their applications. To handle the problem of accessing the modules of a distributed application when the location of a specific module cannot be predicted, a standard set of interfaces is required for naming, locating, and accessing these modules. This requires a standard way to identify which operations a given module supports, and a standard way to pass messages between modules. Supporting these needs is the goal of components and component-based approaches to software development. Thus, a distributed system, once it becomes larger than a small application, must be component-based. Consequently, any serious web-based application development product must be based on a component standard.
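A minimal sketch of the naming-and-locating idea follows. The Registry class here is invented for the example; real component models provide this as infrastructure (for instance, the CORBA Naming Service, or JNDI lookups in EJB environments).

```java
// Modules register under a logical name; clients look them up by name
// rather than by physical location. All names here are illustrative.
import java.util.HashMap;
import java.util.Map;

interface Module {
    String invoke(String operation);
}

class Registry {
    private final Map<String, Module> modules = new HashMap<>();
    public void bind(String name, Module m) { modules.put(name, m); }
    public Module lookup(String name) { return modules.get(name); }
}

public class NamingSketch {
    public static String demo() {
        Registry registry = new Registry();
        // The client never learns where OrderService actually runs;
        // it depends only on the logical name and the Module interface.
        registry.bind("OrderService", op -> "handled:" + op);
        return registry.lookup("OrderService").invoke("create");
    }
    public static void main(String[] args) {
        System.out.println(demo()); // prints handled:create
    }
}
```

In a distributed setting the registry itself is a network service, but the client-side pattern (bind under a name, look up by name, invoke through a standard interface) is the same.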
As a result, when organizations start to build distributed web-based applications, they tend to move to components and to an n-tiered model that places business logic in components that reside on the middle tiers. Over the past few years this approach has been widely adopted by end-user organizations, and is supported by key standards from Microsoft through its COM+ initiative, Sun/IBM through its Enterprise Java Beans (EJB) specification, and the Object Management Group (OMG) with its efforts to standardize an open server component model based on the Common Object Request Broker Architecture (CORBA).
However, a component-based approach is more than simply deploying components to the middle tiers of an N-tier architecture. The application server is a platform for hosting distributed systems, integrating different kinds of assets across the corporation. Making effective use of this distributed systems platform places additional demands on how applications are designed and assembled.
The best approaches to achieve this are based on component technology.
Component approaches concentrate development efforts on defining interfaces to pieces of a system, and on describing an application as the collaborations that occur among those interfaces. The interface is the focal point for all analysis and design activities, which allows the implementation of a component to be independent of the way the component is accessed and used. As a result, the components can be provisioned using a variety of technologies: by wrapping existing code or data, by using the application programming interface (API) of a purchased package, or by writing new purpose-built logic in an application development tool.
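These provisioning choices can be sketched as follows. The CustomerLookup interface and the legacy routine are hypothetical, invented for the example; what matters is that callers see only the interface, not how each implementation behind it is provisioned.

```java
// The interface is the unit of design; how each implementation is
// provisioned (wrapping legacy code vs. new logic) is hidden from callers.
interface CustomerLookup {
    String nameFor(String customerId);
}

// Simulated legacy routine with an awkward calling convention.
class LegacySystem {
    static String CUST_NM_GET(String id) { return "LEGACY-" + id; }
}

// Provisioned by wrapping the legacy routine behind the interface.
class LegacyWrapper implements CustomerLookup {
    public String nameFor(String customerId) {
        return LegacySystem.CUST_NM_GET(customerId);
    }
}

// Provisioned as newly written, purpose-built logic.
class NewLookup implements CustomerLookup {
    public String nameFor(String customerId) { return "NEW-" + customerId; }
}

public class ProvisioningSketch {
    public static String demo() {
        // Callers depend only on the interface, not on the provisioning.
        CustomerLookup wrapped = new LegacyWrapper();
        return wrapped.nameFor("42");
    }
    public static void main(String[] args) {
        System.out.println(demo());                        // prints LEGACY-42
        System.out.println(new NewLookup().nameFor("42")); // prints NEW-42
    }
}
```

Swapping the wrapped legacy implementation for the new one requires no change to any caller, which is precisely the flexibility the interface-centered design buys.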
Figure 7. Example Component Architecture Diagrams.
In Figure 7, we illustrate the kinds of component architecture diagrams important in understanding how a distributed system is designed and deployed. The figure shows that different kinds of component architecture are needed, including conceptual, implementation, and deployment architectures.
The role of components and component design approaches becomes even more important when we consider Internet-based systems. As we have discussed, organizations building distributed web-based applications will tend to develop an N-tier architecture. Components and component technologies provide the enabling approaches for developing these applications in a robust, repeatable way based on industry standards. Because of the importance of components to web-based applications, many of the available application server technologies have become specialized toward supporting one of the many component standards that have been developed.
The component model defines a set of standards for designing, assembling, and deploying applications from pieces. It describes how each component must make its services available to others, how to connect one component to another, which common utility services can be assumed to be provided by the infrastructure supporting the component model, how new components announce their availability to others, and so on. In short, by supporting a particular component model, an application server defines the programming approach developers use to build for it.
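As an illustration of one obligation a component model standardizes, the sketch below shows a component describing the operations it makes available and accepting calls through a uniform convention. The interfaces here are invented for the example; real models express this through IDL (CORBA), type libraries (COM+), or deployment descriptors (EJB).

```java
// A component advertises its operations and accepts calls through a
// uniform invoke method, so the hosting infrastructure can wire
// components together without knowing their internals. Illustrative only.
import java.util.Arrays;
import java.util.List;

interface Component {
    List<String> operations();        // services made available to others
    String invoke(String operation);  // uniform calling convention
}

class OrderComponent implements Component {
    public List<String> operations() {
        return Arrays.asList("create", "cancel");
    }
    public String invoke(String operation) {
        if (!operations().contains(operation)) {
            throw new IllegalArgumentException("unsupported: " + operation);
        }
        return "order." + operation + " done";
    }
}

public class ModelSketch {
    public static String demo() {
        Component c = new OrderComponent();
        // Discover what the component offers, then call it uniformly.
        return c.invoke(c.operations().get(0));
    }
    public static void main(String[] args) {
        System.out.println(demo()); // prints order.create done
    }
}
```

Because every component honors the same contract, the infrastructure can offer shared services (transactions, security, life-cycle management) without per-component arrangements.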
Three particular component models currently dominate the market: COM+, CORBA, and Java/EJB. These are summarized in Table 1.
Table 1. A Comparison of COM+, CORBA, and EJB from an EAI Perspective
Microsoft is positioning Windows 2000 as an application server platform based on its support for COM+, a combination of the COM object model, the MTS transaction manager, and a programming model around these, supported by events, asynchronous messaging, dynamic load balancing, and life-cycle management services.
The Object Management Group (OMG) has been developing standards for distributed systems for a number of years. It has been most successful with its CORBA standard, of which a number of implementations exist. It is now about to release a component model for CORBA, heavily influenced by the Java/EJB standard.
Sun recently released the first version of its server-side standard for Java, Enterprise Java Beans (EJB). More than 25 companies have already announced that they intend to support the EJB specification. This provides one of the final pieces for a complete end-to-end story for the use of Java as an enterprise application development language.
The enterprise application development world is changing fast. Very few people could have predicted the massive changes that have occurred over the past five years in the business environment and technical landscape. In particular, the importance of the Internet and its related technologies has taken everyone by surprise. As a result of these changes, end users of enterprise-scale information systems have significantly different expectations about their flexibility, availability, and usability. These translate into a whole new set of demands on IT departments and their supporting technologies.
How can IT professionals meet these needs? The answer appears to come from the three important trends discussed in this paper: the move to distributed, Internet-based systems; components and component-based development; and enterprise application integration.
These are the critical elements of future enterprise-scale application solutions in the Internet age.