
Conference Report

Managers, Micros and Mainframes

The 1985 New York University Symposium was devoted to the theme "Integrating Systems for End Users - Managers, Micros and Mainframes". It was held in New York City from May 22-24, 1985 and was sponsored by the New York University Graduate School of Business Administration, Computer Applications and Information Systems. Managers, Micros and Mainframes can be seen as the sides of a triangle to which management information systems research needs to give more attention, according to the Chairman of the Organizing Committee, Professor Matthias Jarke. The purpose of the symposium was to provide a forum for researchers and practitioners to address the issue of integrating technology and management information systems in micro-mainframe computing. Major organisations see this as a crucial task, Jarke said. The three-day symposium focussed on different areas, namely:

- the relationship between Managers and Micros (sessions I and II);
- planning strategies for specific technological aspects such as decision support systems, communications design and task distribution strategies among micros and mainframes (sessions III, IV and V);
- future trends in micro-mainframe computing, contrasting the viewpoints of hardware technology, artificial intelligence and management science (session VI).

In his keynote address entitled "Office Automation Tools", D. Tsichritzis (University of Geneva, Switzerland) presented a lively opening filled with many examples and analogies to clarify his points in areas such as first- and second-generation and multimedia tools, office procedure specification, models and objects.

In addition to the lectures there were two panel discussions:

1. The theme was "Educational Issues in End-User Computing". A number of issues were addressed, namely the future roles of information-systems professionals and end-users, what they should learn, and what the instructional methods should be in in-house and university education. The panelists were L. Ball (Babson College, Wellesley, MA, USA), M. Lieberman (Shearson Lehman/American Express Inc.), and M. Weisburgh (Personal Computer Learning Centers of America Inc.).

2. T.K. Bikson (The Rand Corp.), J. Gosden (Equitable Life), and P.C. Semprevivo (Deloitte, Haskins & Sells) were the panelists on the theme "Individual, Organisational and Societal Implications of End-User Computing". They assessed the implications of the themes of the conference for executive management and for the management of technology. The panel emphasised the extent to which the changes represent significant realignments among organisational stakeholders.

We present below a report on the lectures delivered at the conference.

I. Diffusion of End-User Computing

Company Experiences

Broadly defined, end-user computing is the use of computer-based information systems by anyone outside of the formal data processing or information systems areas. It includes managers and professionals, word processing, time-sharing systems and electronic mail. The development of end-user computing is part of the overall growth and direction of the information systems field toward managerial support applications, according to C.V. Bullen (Center for Information Systems Research, MIT, Cambridge, MA, USA). She described the background and results of several research projects aimed at better understanding the phenomenon of end-user computing, for the purposes of providing managerial guidelines and corporate strategies for its management in organisations. Bullen noted that from a strategic viewpoint the plan for end-user computing is incomplete unless it addresses all its forms.

According to Bullen, the studies show that the area in which companies seriously need action is the formulation of a strategy for end-user computing. Lacking such a strategy, companies expand on inappropriate or obsolete hardware and software. This strategy must include a plan to benefit the company as a whole, and its creation must be carried out by a cooperative partnership of IS and line management, she said. Bullen continued by covering the implications and recommendations coming out of the research studies. She concluded her lecture by touching on directions for the future in end-user computing, namely the development of interconnection needs. Joining personal computers to time-sharing systems would provide a solution that combines the power of time-sharing access to databases with the independence of personal computers. If the information systems area and the end-user community cooperate, there are mutual benefits in the understanding and application of the new computer-based tools. The extent to which companies harness these benefits determines the realisation of the true value of end-user computing, Bullen concluded.

Organisational Environment and Management Policy

End-user computing and the microcomputers which support it pose a challenge to the way organisations think about and use information, and to the way managers control the information resource. Managers are now learning that, in order to be useful, PCs must be connected to other corporate information systems and databases, said K.C. Laudon (Graduate School of Business, New York University, NY, USA).

He reported on a survey of 25 of the largest corporate users of PCs in the financial industry. The survey focussed on non-data-processing use of PCs in end-user divisions and on observed variations in the way organisations respond to the opportunities and challenges posed by PCs. He examined descriptive data on the uses and users, quality control, management and decision-making practices. Laudon also reviewed some of the professional literature on end-user computing and outlined the conclusions drawn from it. After this, Laudon concerned himself with the development of a stage theory based on PC telecommunication configurations to account for organisational differences in utilization and management. Laudon reported finally that the data supported the notion that linking PCs into corporate networks leads to more management use. He concluded that in order to bring about this link, PCs must be actively managed and decision-making more centralised.

II. Management of End-User Computing

Diffusion of End-User Computing Technologies

L.R. Porter (Harvard University, MA, USA) began his talk by giving a brief history of IS thought. He mentioned that in the mid-fifties organisations embraced computer technology, after which the concept of total systems emerged in the 60's, though few of these systems proved to be successful. Porter then gave a short outline of the technology in a broader context. While IS successfully developed structured applications, significant problems arose in the attempt to build systems to support management decision-making. Also, as information technology became more pervasive, IS faced increased difficulty in maintaining systems that would meet the divergent needs of their various users. User expectations increase as users gain experience with both mainframe and PC-based information systems. They have discovered new ways to use information technology. What must be provided are tools that are adaptable to changing needs, the speaker said. This means that the type of systems which organisations should be installing should be based on intensive technology. Porter focussed further on the individual employee and the organisation. He discussed these two categories in relation to task support, office process automation and role enhancement.

He concluded by observing that management practices are still rooted in an obsolete industrial model. However, IS management should view their role as that of coordinating and consulting, and not as being the sole suppliers of information resources, Porter said.

The IS Function and End-Users

One important facet of an IS group's distribution and technology transfer activities involves establishing a supportive infrastructure that promotes end-user computing. In order to encourage and facilitate this, linking mechanisms are used. In their paper R.W. Zmud and M.R. Lind (School of Business Administration, University of North Carolina, Chapel Hill, NC, USA) described the findings of a study aimed at examining this subject. Their main question was whether linking mechanisms are key factors in determining the extent to which end-users exploit information technologies, and how to characterise the organisations which make greater use of them. The investigation covered 21 organisations in relation to 12 currently used linking mechanisms. The findings suggest that the greater the use of formal organisational linking mechanisms, the greater the extent of end-user computing. Some linking mechanisms appear to be more effective than others; this is probably dependent on the degree of formality of the organisational environment. Some of the linking mechanisms are good for introducing radical technologies into an organisation or for inducing innovative management support aids. Zmud and Lind concluded that the questions raised by the study warrant further research in this area.

Information Centers

The lack of system usability by end-users is seen as a major problem inhibiting IS productivity by R. Harris (IBM, USA). Traditional systems are often inflexible, and end-users have problems with the application backlog. Harris presented a number of user-friendly tools to help users become self-sufficient with data, eliminate the costs and delays of paperwork, and increase analytical capabilities in decision-making. He described how these could be integrated into an information center organisation which provides consultants with education and management support, thereby encouraging their use. He stressed the role of senior management commitment in this process.

III. Decision Support Systems

End-User Developed DSS

Rapid advances in information processing technology, together with concurrent cost reductions, have enabled inexperienced users to develop and operate their own Decision Support Systems (DSS). Users are enthusiastic about this because of the potential benefits and pay-offs (such as enhanced productivity and a reduction in the implementation problems associated with traditional analyst-driven approaches). There are, however, potential organisational risks in the areas of acquisition, design and development, and data management. M. Alavi (College of Business Administration, University of Houston, TX, USA) provided a number of scenarios to illustrate and examine the risks and unfavourable side-effects that occur in conjunction with user-developed DSS. For example, lack of practical DSS design experience, amongst other things, can lead to low-quality DSS which can have a negative effect on organisational decisions and activities. However, well-developed DSS can boost productivity by allowing managers to manipulate and analyse information with great effectiveness and speed. In conclusion, Alavi suggested specific quality control methods and steps aimed at the reduction and management of the risks. The methods mentioned include audit and review teams, organisational and management policies, support and training facilities, and hardware/software techniques. Alavi pointed out that the essence of user-developed DSS quality control is to obtain a good balance between control procedures for risk reduction and flexibility towards innovation and experimentation.

DSS for Remote Multi-Person Decisions

Certain trends can be expected in the future which will create a need for more and quicker Multi-Person Decisions (MPDs). While it is the turbulence of the business environment and the emergence of decision-making as a speciality that generate these trends, managers want shorter and more effective meetings, and the need for computerised support of organisational decision-making is evident. A technology-based solution to the problem is one where, via terminals and the use of shared screens, decision-makers exchange their expertise. Alternatively, the use of DSS that distributes multi-person decision-making means that the meeting can be done away with. In order to extend the single-user DSS to Group DSS, and to DSS for communication and arbitration, high-level communication facilities must be designed and combined with traditional DSS components. In their talk M. Jarke (Graduate School of Business Administration, New York University, NY, USA), M. Tawfik Jelassi (Indiana University, Bloomington, IN, USA) and T.X. Bui (Naval Postgraduate School, Monterey, USA) focussed on the data sharing and communication needs of different functions in MPDSS. They presented a straightforward approach in that they reviewed the traditional single-user DSS concept and discussed a sequence of architectures for remote MPDSS (from simple data-sharing right up to mediation support for group leaders). They then proposed some possible architectures in a micro-mainframe setting and illustrated the concrete use of the theoretical concepts by relating their experiences in designing micro-mainframe DSS for a European car manufacturer. They concluded by summarizing the implementation problems for MPDSS and addressed some important problems in remote decision situations where direct communication with other group members may be difficult.

Identifying Strategic Opportunities for DSS

DSS is in many ways a design concept, a combination of computer technology and design methodology. It draws on theories that encourage new roles for the user in a restructured relationship between the analyst and the user, who is seen as a learner. A DSS is able to influence the user's beliefs and assumptions, and to improve the capacity to take effective action (behaviour). The methodology provides for increased participation of the user and helps users to converge to a system they feel meets their needs. In his talk J.C. Henderson (Sloan School of Management, MIT, Cambridge, MA, USA) expressed his support for the concept that the end-user population (and particularly the DSS group) must begin to address 'decisions that matter'. He focussed on a design approach that will allow users to build systems for such decisions. It incorporates design techniques that focus on beliefs, assumptions, behaviours and critical processes, and it also raises the level of the design context. The main issue is not that concepts of how to build DSS should be radically changed, but that it is necessary to better direct investments in where DSS is built, according to the speaker. In the second part of his lecture Henderson discussed how current DSS planning and design methodologies can be extended. He also provided an extension to the planning process that uses concepts relating to competitive advantage to identify further opportunities for investing in strategically important end-user systems. Henderson rounded off his talk by summarizing his viewpoint, namely that the proposed framework provides a starting point for identifying strategically important decision processes and support activities, while it is also a means to examine beliefs and behaviour.

IV. Communications Design

Building Telecommunications Infrastructure

During the 80's and the 90's information technology will become a key factor in business planning, but the vocabulary and a framework for planning and decision-making are absent, said P.G.W. Keen (London Business School, UK). Keen based his lecture on his recent book "Business without Bounds: Telecommunications and Business Strategy" (Ballinger/Harper & Row, 1985) and covered its chapters systematically in his talk. He brought to the fore that the aim of the book is to help senior managers play an active role in exploiting telecommunications for business opportunities. He noted that a major stumbling block has been the cultural, organisational and attitudinal gap between technicians and business people. Exploiting telecommunications for business strategy involves a mix of technical, organisational, financial and business issues, but it is not clear what the trade-offs are.

Keen argued in detail the case for large investments in the communications infrastructure that defines the highways to carry a new range of business products and services. He also addressed the topic of how to exploit the opportunities that have been opened up by telecommunications. He described concrete examples from companies that have already gained clear strategic advantage. According to Keen, the most strategic issue is building an infrastructure that will provide the information highways that carry or facilitate the business traffic, move information and coordinate key activities. He illustrated his ideas on strategy by describing the experiences of a manufacturing company. Furthermore, he discussed the components of a telecommunications architecture and the issue of standards. He concluded by describing an approach which argues the business case for radical organisational moves to exploit the business opportunities of telecommunications.

Large Networks of PCs

In this lecture B. Gavish (Graduate School of Management, University of Rochester, NY, USA) put forward that users of personal computers initially tended to prefer stand-alone stations. Later they discovered that producing higher-quality output requires investment in peripherals such as hard-copy output devices, and the cost of such peripheral devices can easily surpass the cost of the PC itself. A microeconomic analysis of the user incentives to pay for and use an advanced PC attached to a large-scale local area network (LAN) shows that this combination leads to a situation which encourages users to cooperate and join such systems. The marginal cost per user of providing shared services decreases with the increase in the number of users sharing that service, while the utility offered to network users increases with the number of network subscribers. Gavish illustrated the point with an example of online mail services. Following on this, Gavish discussed the three basic topologies of LANs and summarized the major characteristics of transmission media. The lecture then turned to the issue of integrated communication services. Gavish finalised the lecture with an outline of the costs associated with establishing a LAN.
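As a rough illustration of this cost-sharing argument, the total cost of a shared service for n subscribers can be written as a fixed component F plus a small incremental cost c per user; the linear forms and the symbols F, c, u_0 and k below are illustrative assumptions, not figures from the talk:

\[
  C(n) = F + c\,n, \qquad \frac{C(n)}{n} = \frac{F}{n} + c \;\longrightarrow\; c \quad (n \to \infty),
\]
\[
  U(n) = u_0 + k\,(n - 1), \qquad k > 0 .
\]

The per-user share of the fixed cost, F/n, shrinks as more users join, while a subscriber's utility U(n) grows with the number of other reachable subscribers (as in the online mail example), which is the incentive to join the network that Gavish described.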

V. Task Distribution

Task Allocation Strategies in Micro-Mainframe Applications

Applications software currently on the market is characterised by a notion of centralisation. Many users are not satisfied with the degree of user comfort in these systems and want more flexibility in constructing reports and obtaining support for their decision-making. The possibility of using micros raises users' hopes of becoming less dependent upon inflexible DP systems and of being able to construct ad hoc reports at will. However, there is the danger that micros will only fill the gaps of classical, centrally organised DP systems. A trade-off between advantages and disadvantages must be decided upon: whether the micro is to be used as an additional analytic instrument or whether it is to serve as an integral component of a new conceptualisation of data processing. A.W. Scheer (University of Saarland, F.R.G.) dealt with this in his lecture and investigated the fundamental criteria for the employment of microcomputers within the framework of comprehensive applications systems. He also reviewed the kinds of problems particularly suited to microcomputers and the form which solutions to these problems might take.

Distributed Databases and Distributed Processing

The integration of PC and mainframe processing places new requirements on the applications and the systems that support them. The main challenge is to retain PC autonomy together with corporate and department-level data integrity, according to D.R. Ries (Computer Corporation of America, USA). The end-user needs to be able to transform his data into relational tables and other formats of the user's choosing. The important aspect of these transformations is that the user selects and controls the formats independently of an applications programmer or a database administrator. Ries then went on to outline some of the database management system requirements that are needed to support end-user capabilities. The support for PC user-driven applications will require different types of support for simultaneous users, extensions for recovery and error reporting, and an increasing use of integrity constraints.

He concluded with a brief discussion of the current state of systems that provide these capabilities. He based his discussion on the premise that users and administrators should be provided with tools to facilitate the access and transfer of data and to ensure the integrity and accountability of the database. Alternatively, there is the possibility of more tightly integrated PC and mainframe software. In that case application packages and distributed databases will be developed that treat the PC as an intelligent terminal and perform the distribution automatically, with the types and styles of the PCs being largely determined by what is compatible with mainframe processing.

VI. Future Directions: Technological and Organisational Perspectives

Advances in Workstation Design

Since the mid-seventies a great deal of discussion has occurred about the kind of workstation that would appear on the automated worker's desk. It would be highly differentiated, as uncomputer-like as possible in appearance and in its man/machine interface. It would also be fairly expensive, since it would need to be of high intelligence to compensate for the naiveté of its user. It has turned out that most workstations are neither expensive nor specialised, but rather relatively inexpensive, general-purpose PCs. A.D. Wohl (Wohl Associates, Bala Cynwyd, PA, USA) set about systematically leading up to the design of new models for the future. She covered the changes which have occurred and have influenced the nature of workstations. She also reviewed differences between the average and most advanced workstations of today and tomorrow. She concluded her lecture with a discussion of current design models and described the issues that might be explored in working toward new design models for workstations, e.g. the use of artificial intelligence, redefinable interfaces, and machines that accommodate differing rates and levels of learning.

Implications of Open Systems

Systems of interconnected and interdependent computers are qualitatively different from the relatively isolated computers of the past. C. Hewitt (MIT, Cambridge, MA, USA) discussed some of the implications and constraints imposed by new developments such as (1) the growth in the number of PCs, (2) the development of local and national electronic networks, and (3) the widespread requirement of arm's-length transactions among agencies and organisations. Open systems uncover important limitations in current approaches to artificial intelligence (such as problem spaces and logic programming). A new approach which is more like organizational design and management is therefore needed. Open systems are always subject to communications and constraints from outside, such as (1) continuous growth and evolution, (2) arm's-length relationships and decentralized decision-making, (3) inconsistency among knowledge bases, (4) the need for negotiation among system components, and (5) the inadequacy of the closed-world assumption. Hewitt proceeded to show how arbiters are incorporated into an open system. He explained further how the indeterminacy of arbiters used in the implementation of open systems results in decisions that cannot be proved from knowledge of the structure of the computing system and its input. Using an example, he showed how decisions of an open system can be justified by agreement, even though they do not follow from any proof. Having introduced the issue of exploration vs. search, he discussed Planner, an early AI programming language. The ideas in Planner have been generalised and perfected in subsequent programming languages but do not address all the needs of open systems. Hewitt then outlined some of the limitations that are inherent in the use of logic as a programming language for dealing reliably with empirical knowledge. In the remainder of his talk Hewitt confined himself to first-order logic. He considered two conjectures concerning the problem of inconsistency and theories of meaning, and critically reviewed the current Prolog language. The last aspects he covered were the need for due-process reasoning and information processing principles for the future. In addition to reflective problem solving there are many other principles that should be adopted to address the needs of constructing reliable intelligent systems that meet the needs of open systems. Hewitt concluded by saying that logical reasoning is a useful module in the repertoire of an intelligent system, but not everything. Use of the principles used in designing and managing large-scale organisations will be fundamental to the future of open systems, he asserted.
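As a small illustration of the arbiter point above (the mechanism and names below are modern illustrative assumptions, not Hewitt's formulation), the following sketch shows a decision whose outcome depends on the relative timing of two concurrent requests and therefore cannot be deduced from the program text and its input alone:

import threading
import queue

# Illustrative sketch only: an "arbiter" grants a shared resource to
# whichever of two concurrent requests reaches it first.  The program and
# its input are identical on every run, yet the winner depends on thread
# scheduling, so the decision is not provable from the code and input alone.

def request(name, arbiter):
    # Each client simply announces itself to the arbiter.
    arbiter.put(name)

def arbitrate():
    arbiter = queue.Queue()
    clients = [threading.Thread(target=request, args=(name, arbiter))
               for name in ("client_A", "client_B")]
    for t in clients:
        t.start()
    winner = arbiter.get()   # the first request to arrive is granted access
    for t in clients:
        t.join()
    return winner

if __name__ == "__main__":
    # Repeated runs with the same input can produce different winners.
    print([arbitrate() for _ in range(5)])

Different runs can legitimately return different winners; in Hewitt's terms, the system's choice is justified by the agreement reached through arbitration rather than derived as a theorem from its specification.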

Information Processing in Organizations

G.P. Huber and R.R. McDaniel Jr. (University of Texas, Austin, TX, USA) commenced their lecture with an overview of the paradigms on which organisations have been based in the past and described three of them in short. They then proceeded to set forth a fourth paradigm formalised by themselves. They also discussed the societal and disciplinary forces that have necessitated the development of a Decision Making/Information Processing paradigm such as theirs, e.g. the cultural unacceptability of currently available paradigms. According to them, expertise and the growing complexity of organisational decisions requiring participative decision-making have become the more important sources of power and influence in organisations. They explained that their paradigm is focussed on creating structures that facilitate decision-making and information processing. Central to this is the matter of computing and communications technology. Following on this, they gave the audience an idea of the benefits to users of their paradigm and clarified their viewpoint with five examples. They outlined the domain of its applicability and elaborated generally on the paradigm and its central concepts. Furthermore, they explicated a sampling of design guidelines applicable to the paradigm. They concluded by stating their belief that their paradigm would permit managers to take advantage of changing circumstances and to gain a significant competitive edge.

The Proceedings of the symposium have been published by the Center for Research on Information Systems, Computer Applications and Information Systems Area, Graduate School of Business Administration, New York University, 100 Trinity Place, New York, NY 10006, USA. iv + 352 pages.