Prioritizing BI Opportunities (BIOs)

Steve Williams, in Business Intelligence Strategy and Big Data Analytics, 2016

5.3.2 Discounted Cash Flow ROI Model

Despite the substantial methodological issues that arise in estimating future cash flows, most companies require some form of ROI analysis. Whatever the method, any ROI model rests on a web of assumptions about future cash flows. For BI, the assumptions must address how the particular BIO and its associated BI applications will improve a specific business process in a way that increases revenue, reduces cost, or both. For example, predictive analytics can increase the acceptance rate of a marketing offer while reducing the cost of delivering it, by narrowing the target list to people or groups who have shown a propensity to accept such offers. If you can tie the potential investment in analytics to a specific business process in a way that makes intuitive sense to experienced business leaders, you can build an ROI model that is as rigorous as possible given the inherent limitations of quantified ROI analysis. Table 5.2 is an example of such a model. As noted earlier, the BIO with the highest ROI might not be selected as the top-priority BIO, for many good reasons; on the other hand, using the ROI model to prioritize the BIOs is a valid approach as well.

Table 5.2. An ROI Model Can Be Used to Predict the Economic Impact of BIOs

Return-on-Investment Worksheet

Projected Costs and Benefits               2013   2014   2015   2016   2017   Total

1.0 Costs ($)
  1.1 BI projects
        IT project team labor
        Business project team labor
  1.2 Infrastructure
        IT project team labor
        Software
        Hardware
        Professional services/external consultants/vendors
  Costs subtotal                           $ –    $ –    $ –    $ –    $ –    $ –

2.0 Benefits
  2.1 Supply chain business intelligence
  2.2 Cross-selling business intelligence
  2.3 New product business intelligence
  Benefits subtotal                        $ –    $ –    $ –    $ –    $ –    $ –

NET benefits/costs
  NET: $ –
  NPV: $ –
  Discount rate: 2.0%
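As a rough sketch of the arithmetic behind the worksheet's NPV line (the net cash flows below are hypothetical illustrations, not figures from the chapter; only the 2.0% discount rate comes from the worksheet):

```python
def npv(rate, cash_flows):
    """Discount each year's net cash flow back to year 0 and sum.

    Flows are assumed to arrive at the end of years 1, 2, 3, ...
    """
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1))

# Hypothetical net cash flows (benefits minus costs) for 2013-2017, in $000s:
# an up-front investment followed by growing benefits
net_flows = [-500, 100, 300, 400, 450]
result = npv(0.02, net_flows)   # 2.0% discount rate, as in the worksheet
```

In a real business case each year's net flow would come from the costs and benefits subtotals above, with the assumptions behind each benefit line vetted by the business stakeholders, as the vignette below describes.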

Vignette: Building a Credible ROI Model for a BI Program

A financial services firm uses a standard discounted cash flow (DCF) model to evaluate all of its capital investments. As is often the case with DCF analysis, the capital and operating expense components of the investment are relatively easier to estimate than are the potential incremental revenues, cost reductions, and cash flows—that is, the benefits. To ensure that the assumptions used to project the benefits of the various BIOs were defensible and would make intuitive sense to senior executives, the BI program manager recruited a cross-functional team that included all of the business stakeholders for the BIOs plus a representative from the Chief Financial Officer’s organization. Over the course of several weeks, the business stakeholders for each individual BIO developed the assumptions about the economic benefits, and then obtained feedback from within their respective business functions. When the business case was pulled together, the BI program manager could submit it to the executive team with confidence. The DCF model was perceived by the executive team as a big improvement over previous approaches to justifying the BI investment, which had relied upon vague assertions from well-known vendors about “customer intimacy,” “enhanced decision-making,” and so forth. By tying benefits statements to specific BI process improvements and using defensible, business-driven assumptions about future cash flows, the BI team succeeded in gaining executive approval for the BI program.

URL: https://www.sciencedirect.com/science/article/pii/B9780128091982000051

Asset Pricing: Derivative Assets

S.M. Schaefer, in International Encyclopedia of the Social & Behavioral Sciences, 2001

3 Derivative Pricing vs. General Asset Pricing

The price of every risky asset, including derivatives, depends on agents' expectations of the future cash flows and on their willingness to bear risk; in other words, on their preferences. (For a description of modern asset pricing theory see Duffie 1996; Campbell 2000 reviews both the theory and empirical evidence.) What, then, is special about the problem of pricing derivatives as distinct from other risky assets? The key difference is that in a number of important cases, including the Black–Scholes model, while the prices of both the derivative and the underlying security depend on expectations and preferences, the relation between the two does not. In this case, given the price of the underlying security, the price of a derivative may be determined independently of agents' expectations of future cash flows and their preferences.

The economic principle that is used to establish the link between the price of a derivative and the price of its underlying asset is no-arbitrage. An arbitrage opportunity is defined as one that requires no capital, has a positive probability of a profit, and a zero probability of a loss. In other words, it provides ‘something for nothing’ and any rational investor, who prefers more to less, irrespective of their attitude towards risk, would choose to exploit such opportunities. In a well-functioning market, therefore, no arbitrage opportunities should exist and, in the economy studied by Black and Scholes, it turns out that the conditions which eliminate the possibility of arbitrage between a derivative and its underlying asset are also sufficient to determine the relation between their prices.

URL: https://www.sciencedirect.com/science/article/pii/B008043076702266X

Real Option Theory and Monte Carlo Simulation

Caesar Wu, Rajkumar Buyya, in Cloud Data Centers and Cost Modeling, 2015

18.11.2 Sensitivity Analysis with Different Scenarios

Scenario analysis tests NPV under different circumstances. Here, we investigate NPV under just the worst, best, and normal scenarios. The analysis holds IRR and initial capex unchanged and varies the future revenue income, or future cash flow. The variation is not a minor adjustment but a radical change to the future DCF, such as −50% or +50%. Because probabilities must be assigned to the scenarios, the evaluation this analysis provides is subjective (see Table 18.9).

Table 18.9. Scenario Sensitivity Test

Year                     2014     2015     2016     2017     2018
Normal case ($m)         $24.15   $16.45   $32.20   $40.95   $42.70
Worst case ($m), −50%    $12.08   $8.23    $16.10   $20.48   $21.35
Best case ($m), +50%     $36.23   $24.68   $48.30   $61.43   $64.05

                   NPV        Probability
Normal NPV ($m)    $4.30      50%
Worst NPV ($m)     −$34.25    30%
Best NPV ($m)      $42.85     20%

If we assign different probabilities (50% for normal, 30% for worst, and 20% for best) to each scenario, we will achieve a so-called expected value:

Expected NPV = $4.30 × 50% + (−$34.25) × 30% + $42.85 × 20% = $0.44 million

Of course, any set of probabilities (10%, 25%, 65%, …) can be assigned to the scenarios; the choice rests on personal intuition or experience. If you don’t have enough experience with a particular market, such as the cloud services market, you can adopt a trial-and-error method to verify that the expected value makes sense before the MCS analysis.

Now, the issue we face is that all the probabilities are decided subjectively. Which NPV is the right one, $4.3 million or $0.44 million? To resolve this issue, we can borrow the tool of Monte Carlo simulation (MCS). Other methods are available to solve the same problem, but based on our discussion in Section 18.7.4.1 of what MCS is good for, MCS is the best fit for this issue.
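A minimal MCS sketch of this scenario problem follows. The normal-case cash flows are taken from Table 18.9, but the initial capex ($110m), the 10% discount rate, and the triangular shock distribution are assumptions chosen for illustration (the capex and rate are picked so the base case lands near the table's normal NPV); they are not values from the chapter:

```python
import random

random.seed(42)
base = [24.15, 16.45, 32.20, 40.95, 42.70]  # normal-case cash flows, $m (Table 18.9)
capex, rate = 110.0, 0.10                    # assumed initial outlay and discount rate

def npv(shock):
    """NPV when every year's cash flow is scaled by (1 + shock)."""
    pv = sum(cf * (1 + shock) / (1 + rate) ** t
             for t, cf in enumerate(base, start=1))
    return pv - capex

# Instead of assigning three fixed probabilities by hand, draw the revenue
# shock between -50% and +50% with its peak at 0 (the normal case)
samples = [npv(random.triangular(-0.5, 0.5, 0.0)) for _ in range(10_000)]
expected = sum(samples) / len(samples)
```

The spread of `samples` shows the full range of NPV outcomes, and `expected` replaces the hand-picked 50/30/20 weighting with an average over the assumed distribution.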

URL: https://www.sciencedirect.com/science/article/pii/B9780128014134000180

GIS Methods and Techniques

Jochen Albrecht, in Comprehensive Geographic Information Systems, 2018

1.31.2.4.2.7.1 Net present value

A dollar earned today is worth more than a dollar earned one or more years from now. The NPV of a time series of cash flows, both incoming and outgoing, is defined as the sum of the present values (PVs) of the individual cash flows of the same entity.

In the case when all future cash flows are incoming and the only outflow of cash is the purchase price, the NPV is simply the PV of the future cash flows minus the purchase price (which is its own PV). NPV is a standard method for using the time value of money to appraise long-term projects. Used for capital budgeting and widely applied throughout economics, finance, and accounting, it measures the excess or shortfall of cash flows, in present-value terms, once financing charges are met. NPV can thus be described as the “difference amount” between the sums of discounted cash inflows and cash outflows, taking inflation and required returns into account. The calculation takes as input the cash flows and a discount rate (or discount curve) and outputs a price: each cash inflow/outflow is discounted back to its present value (PV), and these terms are summed.
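The definition reduces to a few lines of code (the cash flows and rate below are hypothetical figures for illustration):

```python
def present_value(cf, rate, t):
    """PV of a single cash flow received t years from now."""
    return cf / (1 + rate) ** t

# Year 0 holds the purchase price as an outflow; later inflows are
# discounted back to today and everything is summed
cash_flows = [-1000, 400, 400, 400]   # hypothetical project, t = 0..3
rate = 0.08                            # hypothetical discount rate
npv = sum(present_value(cf, rate, t) for t, cf in enumerate(cash_flows))
```

Here the purchase price needs no discounting (it is its own PV at t = 0), matching the special case described above.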

NPV is an indicator of how much value an investment or project adds to the firm. If NPV is positive, the project's discounted cash inflows over its life exceed its discounted outflows; if NPV is negative, they fall short. A risky project with a positive NPV may be accepted, but a positive NPV does not by itself mean the project should be undertaken, since NPV at the cost of capital may not account for opportunity cost (i.e., comparison with other available investments).

URL: https://www.sciencedirect.com/science/article/pii/B9780124095489096123

Accounting Measures

Michael Bromwich, in Encyclopedia of Social Measurement, 2005

The Balance Sheet

This shows the details of the firm's assets, such as land, property, and equipment, and its liabilities in a logical sequence. An asset is an identifiable item that has been either acquired or constructed in-house in the past, is expected to yield future cash flows, and is in the control of the firm in the sense that the firm may decide whether and how to dispose of the asset or use it. The accounting approach to valuing items is bottom-up: each item is valued individually using a method appropriate to that item, and these values are aggregated to obtain valuations for classes of assets and liabilities. The resulting overall asset and liability values will generally incorporate different valuation bases for different classes of accounting items.

Intangible assets are expected to generate future cash flows but have no physical form, such as items of intellectual property, which include patents, licenses, trademarks, and purchased goodwill (the amount by which the price of an acquisition of a company exceeds the net value of all the other assets of the acquired company). Many intangibles obtained in an acquisition are allowed on the balance sheet valued at what is called their “fair value”: what they would exchange for in an arm's length transaction by willing parties. Such fair values are not generally allowed to be changed to reflect changes in market prices over time. The inclusion of such intangibles is in contrast to the unwillingness to extend this treatment to either internally generated goodwill, which reflects the firm's value-added operations, or internally created intangible assets, such as brands. Often in a modern, growing, high-tech firm, the values of such intangible assets may dominate physical assets. The possibilities for inclusion would be very large, including self-created brands, information technology systems, customer databases, and a trained work force. Generally, regulators have resisted attempts to treat such items as assets. Proponents of inclusion argue that this means that many of the sources of company value go unrecognized in the accounts and must be treated as costs in the year incurred. It is argued that this is one reason the accounting value of firms may fall far below their stock exchange values, especially for high-tech companies and companies in a growth phase. This is true even today; compare the accounting and stock market values of Microsoft. This makes empirical work with accounting numbers very difficult. Where intangibles are important in such exercises, they must be estimated separately.

The value of an asset is either based on its original cost or, in most accounting regimes (including the United Kingdom but not the United States), based on a revalued amount taking into account changes in prices over time. Any gains on holding assets increase the value of the firm in its balance sheet and not its operating profit, though such gains or losses are shown elsewhere in the profit-and-loss statement. Firms do not have to revalue assets, and many firms use different valuation bases for different asset classes. Any revaluation must be based on independent expert authority, and such independent assessments must be obtained regularly. By not revaluing assets in the face of favorable price changes, firms build up hidden reserves of profits that are realized when the asset is sold or used in production. Accounting regulators have acted to stop asset overvaluation by making assets subject to a test that their values in the accounts do not exceed either their selling value (which may be very low where assets are specific to the firm) or their value when used by the firm (an estimate by management that may be highly subjective).

It is generally agreed for useful empirical work that assets must be restated at their current market prices minus adjustments for existing depreciation and the assets' technologies relative to that currently available on the market. One example of the use of current cost is in calculating what is called Tobin's Q, the ratio between the firm's stock market value and the current cost of its net assets. One use of Tobin's Q is to argue that firms with a Q ratio of greater than 1 are making, at least temporarily, superprofits.
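Tobin's Q as described above is a simple ratio; a sketch with hypothetical figures:

```python
def tobins_q(market_value, current_cost_net_assets):
    """Tobin's Q: stock market value over the current
    (replacement) cost of the firm's net assets."""
    return market_value / current_cost_net_assets

# Hypothetical firm: $850m market capitalization against net assets
# whose current replacement cost is $500m
q = tobins_q(850.0, 500.0)
# q > 1 would suggest the firm is earning, at least temporarily, superprofits
```

The denominator is the current-cost figure argued for above, i.e., market prices adjusted for depreciation and technology, not historical book value.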

During the inflationary period of the mid-1970s to early 1980s, most leading accounting regimes allowed at least supplementary disclosure of generally reduced profits after allowing for current cost (adjusted replacement cost) depreciation and a balance sheet based on the increased current cost of assets. This movement fizzled out with the decline in inflation, the lack of recognition of current cost profits by tax authorities, managerial unhappiness with the generally lower profits disclosed by this type of accounting, and the need to maintain the current value of existing assets. More traditional accountants also argued that the revaluing of assets meant that profits would be reported (as an increase in the firm's value in the balance sheet) on revaluation and not when profits were realized either by use or by sale. Despite this, many experts believe that it is the intention of some accounting regulators (ASB and IASB) to move toward current cost accounting.

Regulators in most countries have acted, and are planning to act further, on off-balance-sheet items. Leased operating assets, such as airplanes, ships, and car fleets, do not have to appear on the balance sheet; only the annual cost of such leasing appears in the income statement. This understates the firm's capital employed, often by vast amounts, and thereby inflates returns on investment. This is planned to change shortly. Other ways of keeping assets off the balance sheet include setting up a supposedly independent entity that owns the assets and charges the “parent” firm for their use. Similar approaches are available to keep debt and other financial liabilities out of the financial statements.

The treatment of financial assets (those denominated in monetary terms, which include financial instruments) differs between accounting regimes and classes of financial assets. Long-term loans to and investments in other companies not held for sale are generally shown at their original cost. Most other financial assets and liabilities are usually valued at their market prices, with gains and losses usually going to the balance sheet until realized by sale. Almost all regulators are agreed that all financial assets and liabilities should be shown at their market prices, or at estimated market prices where such prices do not exist, which is the case for complex financial instruments (including many derivatives). Any gains and losses would go to the profit-and-loss account. Using market prices means that there will be no need for what is called “hedge accounting.” This accounting means that investments to cover other risks in the firm are kept at their original cost until the event that gives rise to the risk occurs, thereby hiding any gains or losses on the hedges. The aim is to let accounting “tell it as it is,” under the view that market prices give the best unbiased view of what financial items are worth. There are problems here. The approach treats some financial items differently than other items, such as physical assets and intangibles; it records gains and losses before they are realized by sale or by use, whereas this is not allowed for other items; and it abandons to a degree the prudence and matching concepts and, perforce, often uses managerial estimates for complex items not traded on a good market, which conflicts with the wish of regulators to minimize managerial discretion.

URL: https://www.sciencedirect.com/science/article/pii/B0123693985004709

Intangible Assets: Concepts and Measurements

Baruch Lev, in Encyclopedia of Social Measurement, 2005

What Is Unique about Intangibles?

Intangibles differ from physical and financial (stocks, bonds) assets in two important aspects that have considerable implications for the management, valuation, and the financial reporting of intangibles.

Partial Excludability

Although the owner of a commercial building or a bond can enjoy to the fullest the benefits of these assets (is able to exclude fully nonowners from sharing in the benefits), the owners of patents, brands, or unique business processes, and the employers of trained personnel, can at best secure some of the benefits of these intangibles for a limited duration (partial excludability). Patents expire after 20 years, but in many cases are infringed upon by competitors long before expiration; there are thousands of patent and trademark infringement lawsuits filed every year. Brand values are fickle, given severe competition in most economic sectors and frequent changes in customers' tastes, as demonstrated by erstwhile leading brands, such as Xerox, Polaroid, or the airlines Pan Am and TWA, which are now financially struggling or bankrupt. Trained employees often shift employers, taking with them the investment in human capital made by employers. In short, the property rights over intangibles are not as tightly defined and secured as are those over physical and financial assets, challenging owners of intangibles to capture large and sustained shares of the benefits.

The difficulties of fully capturing the value of intangibles increase the riskiness of owning these assets (value dissipation) and complicate their valuation by investors, because valuation generally requires a reliable estimate of future cash flows to owners. As for corporate financial reporting to investors, accountants often claim that the absence of complete control over the benefits of intangibles disqualifies these assets from recognition as such in corporate balance sheets.

Nonmarketability

Although many physical and most financial assets are traded in competitive markets (stock exchanges, used car dealerships), intangibles are by and large not traded in active and transparent markets (i.e., those in which prices and volumes of trade are observable). To be sure, there are frequent transactions in some intangibles, particularly the licensing and sale of patents and occasionally of trademarks, but these transactions are not transparent—details of the deals are generally not publicly disclosed. The major reasons for the “nontradability” of intangibles are the incomplete property rights, mentioned previously, and serious information asymmetries, i.e., differences in knowledge about intangible assets between buyer and seller. Thus, for example, developers of drugs or software know about these intangibles and their profit potential much more than do outsiders, and it is difficult to convey to the fullest such information in a credible way. Trade in assets when owners possess a significant information advantage over potential buyers is often limited or nonexistent.

The nontradability of intangibles causes serious valuation problems for investors and managers, because valuation techniques are often based on “comparables,” which are observed values (prices) of similar assets traded in transparent markets. Nontradability also increases the risk of owning intangibles, given the difficulties or impossibility of selling them before or after completion of development (no exit strategy). For many accountants, the absence of markets disqualifies intangibles from being considered as assets in corporate balance sheets. Intangibles thus differ inherently from physical and financial assets, and the management, valuation, and financial reporting of intangible assets are challenging. Of particular concern in the early 21st century is the vulnerability of intangibles, as expressed by Federal Reserve chairman Alan Greenspan, in testimony (February 27, 2002) to the House of Representatives: “As the recent events surrounding Enron have highlighted, a firm is inherently fragile if its value added emanates more from conceptual [intangible] as distinct from physical assets. A physical asset, whether an office building or an automotive assembly plant, has the capability of producing goods even if the reputation of the managers of such facilities falls under a cloud. The rapidity of Enron's decline is an effective illustration of the vulnerability of a firm whose market value largely rests on capitalized reputation.”

URL: https://www.sciencedirect.com/science/article/pii/B0123693985004710

Information Systems for Agile Manufacturing Environment in the Post-Industrial Stage

S. Subba Rao, A. Nahm, in Agile Manufacturing: The 21st Century Competitive Strategy, 2001

APPENDIX: ACRONYMS

AMIS

Agile manufacturing information system. Proposed by Song and Nagi [25].

AMT

Advanced manufacturing technology. Includes a variety of computerized technologies, such as computer-aided manufacturing (CAM) and computer-aided process planning (CAPP). These technologies can be combined into various types of integrated systems, such as flexible manufacturing systems (FMS) and computer-integrated manufacturing systems (CIM).

BPR

Business process re-engineering. A radical or breakthrough change in a business process, often making use of modern information technology.

CAD

Computer aided design. Product design using computer graphics.

CAPP

Computer aided process planning. A computer system that helps in selecting sequences of operations and machining conditions.

CIM

Computer integrated manufacturing. A system for linking a broad range of manufacturing activities through an integrating computer system.

CORBA

Common object request broker architecture. An evolving framework being developed by the Object Management Group to provide a common approach to systems interworking.

DBMS

Database management system. Groups of software used to set up and maintain a database that will allow users to call up the records they require. In some cases, DBMS also offer report and application-generating facilities.

DCE

Distributed computing environment. A set of definitions and components for distributed computing developed by the Open Software Foundation, an industry-led consortium.

DCF

Discounted cash flow. A technique for obtaining the present value of a future project by multiplying each future cash flow by its discount factor and summing the results.

DCOM

Distributed component object model. Microsoft's system for spreading an application across more than one computer on a network. Built upon Windows NT 4.0 and Windows 98.

ERP

Enterprise resource planning. An MRP II system that ties customers with suppliers.

IP

Internet protocol. A connectionless (i.e. each packet looks after its own delivery), switching protocol. It provides packet routing, fragmentation and reassembly to support TCP. IP is defined in RFC 791.

IS/IT

Information systems/Information technology. IT is a very general term coined in the 1970s to describe the application of computer science and electronics engineering to the specification, design and construction of information-rich systems.

ISO

International Organization for Standardization. A specialized international organization founded in Geneva in 1947, concerned with standardization in all technical and non-technical fields except electrical and electronic engineering.

ISO/DIS

Draft International Standards of ISO, such as the draft revisions of the five standards in the ISO 9000 series.

LAN

Local area network. A communication network consisting of many computers (mostly personal computers and workstations) that is placed within a local area, such as a single building or company.

MCS

Manufacturing control systems. A group of information systems designed to control shop floor activities on a manufacturing plant.

MCSARCH

Manufacturing control systems architecture. Proposed by Aguirre et al. [22].

MMS

Manufacturing message specification. An ISO-approved international communication standard for messaging between various manufacturing machines.

MRP

Material requirements planning. A computer-based information system for ordering and scheduling dependent-demand inventories.

MRP II

Manufacturing resource planning. Expanded approach to production resource planning, involving other areas of a firm in the planning process, such as marketing and finance.

OPT

Optimized production technology. A scheduling software package that emphasizes identifying bottleneck operations and optimizing their use.

OSI

Open systems interconnection. The ISO reference model consisting of seven protocol layers. These are the application, presentation, session, transport, network, link and physical layers. The concept of the protocols is to provide manufacturers and suppliers of communications equipment with a standard that will provide reliable communications across a broad range of equipment types.

SAP

An ERP system, provided by the company of the same name; SAP stands for Systems, Applications, and Products.

STEP

Standard for the exchange of product model data. A neutral format for product definition data, which aims to eliminate the cost, complexity and time related to multiple CAD systems without pulling the plug on any of those systems. With STEP, companies can move data between dissimilar systems.

TCP

Transmission control protocol. The most common transport layer protocol used on Ethernet and the Internet. It was developed by DARPA. TCP is built on top of Internet protocol (IP) and the two are nearly always seen in combination as TCP/IP (which implies TCP running on top of IP). The TCP element adds reliable communication, flow control, multiplexing and connection-oriented communication to the basic IP transport. It is defined in RFC 793.

TCP/IP

Transmission control protocol/Internet protocol. The set of data communication standards adopted, initially on the Internet, for interconnection of dissimilar networks and computing systems.

TQM

Total quality management. Management of an entire organization so that it excels in all aspects of products and services that are important to the customer.

WM

Workflow manager. An automated system that directs application programs’ access to information storage. A part of AMIS, proposed by Song and Nagi [25].

URL: https://www.sciencedirect.com/science/article/pii/B9780080435671500127

Social Measures of Firm Value

Violina P. Rindova, in Encyclopedia of Social Measurement, 2005

Financial versus Social Measures of Firm Value

Financial measures of the value of a firm are quantifications of the economic value of the assets a firm controls and the expected ability of a firm to deploy these assets in generating economic returns. Whereas a systematic overview of financial measures of firm value lies outside the scope of this article, the brief overview provided here serves as a basis for distinguishing between financial and social measures of firm value.

According to finance theory, the value of a firm is the sum of the value of all of its assets. Different definitions of firm value exist and are appropriate for different situations. For example, the liquidating value of the firm can be realized if its assets are sold separately from the organization that has been using them. In contrast, the going-concern value of the firm is its value as an operating business. Further, the accounting value of the firm's assets determines its book value, while the value at which these assets can be sold determines its market value. Because the firm has both a liquidating value and a going-concern value, the higher of the two constitutes the firm's market value. The market value of the firm as a going concern is calculated as the sum of the current and projected cash flows of the firm, discounted at the firm's average cost of capital. Whereas the estimation of future cash flows and the cost of capital may involve various levels of elaboration in analyses, the basic logic of financial measures of firm value is that a firm's value reflects its potential to generate economic returns for its suppliers of capital. This logic is based on the assumption that the suppliers of capital are the residual claimants, who receive their returns after all other claimants (stakeholders) have been satisfied.

Social measures of firm value differ from the financial approaches in several ways. Financial approaches are concerned primarily with evaluation of the economic assets of the firm and their productive capacity to generate economic returns for the suppliers of capital. In contrast, social approaches to firm valuation view the firm as a social agent with whom individuals and other social entities form relationships and pursue a wide range of economic and noneconomic goals, such as quality of life, social justice, professional development, and safe and clean environments. Thus, social measures of firm value are concerned with the role of the firm as a social actor, the activities of which impact a variety of stakeholders, rather than a single stakeholder group. Consequently, social measures of firm value differ from financial measures in their concern with both economic and noneconomic outcomes associated with the operations of a firm, and with the effects of these operations on various stakeholders. It should be noted, however, that financial measures of firm value are a type of social measure as well. As Ralph Hybels has explained, “The schemas of finance and accounting present an overarching set of abstract categories and decision rules that coordinate across specialties and organizations” so that “the dual functions of finance and accounting together certify the economic legitimacy of every type of contemporary organization.”

Whereas financial measures are always explicit and quantitative, social measures take a variety of forms, i.e., explicit and implicit, quantitative and qualitative. Because they are quantitative and standardized, financial measures enable a high degree of comparability of firm value in dollar terms. However, they offer limited means for evaluating aspects of firm performance that are not readily measured in dollars, such as intangible assets, employee satisfaction, and innovativeness of research and development efforts. Social measures seek to capture these aspects of firm performance and do so through both quantitative and qualitative, explicit and implicit means. Examples of explicit quantitative social measures of firm value are various reputational rankings and ratings, such as those published by Fortune Magazine in its “Most Admired Corporations” surveys. Examples of explicit qualitative measures are various awards, such as the annual awards presented by R&D Magazine to companies making technologically significant products. Implicit quantitative measures of firm value are metrics that assess specific organizational practices, rather than the overall performance of the firm and its products. For example, the Investor Responsibility Research Center (IRRC), a not-for-profit organization, offers research for portfolio screening that focuses on firm practices related to waste generation and disposal of toxic and hazardous materials, and the environmental liabilities of a firm. Examples of implicit qualitative measures are various classification schemes that categorize competing firms in an industry, in terms such as “specialty” versus “mass” or “low-end” versus “high-end” producers. As these examples indicate, social measures of firm value can take a variety of forms, including rankings, ratings, awards, and expert and lay categorizations.
Because of this diversity, social measures of firm value may be less readily identifiable and available to the general public than financial measures. Further, the diversity of the forms of social measures suggests that the term “measures” may not capture precisely the nature and forms of social evaluations of firm value. Such evaluations may be better understood as summary representations of business performance that contain explicit or implicit evaluative elements. Their importance in markets and their value to stakeholders derive from the fact that they summarize, and often institutionally “certify,” information about competing firms and thereby reduce stakeholder uncertainty with regard to exchanges with these firms. Therefore, the processes through which information intermediaries collect and process information about firms to generate such evaluative representations are an integral part of the validity and usefulness of such evaluations. The processes through which social measures of firm value are constructed are discussed next.

URL: https://www.sciencedirect.com/science/article/pii/B012369398500548X

Business Architecture

Jamshid Gharajedaghi, in Systems Thinking (Third Edition), 2011

9.5.2 Measurement System

To develop an effective measurement system we need to deal iteratively with two elements: performance criteria and performance measures (Figure 9.19).

Figure 9.19. Relation between performance criteria and performance measures.

9.5.2.1 Performance criteria

Performance criteria are the expression of what is to be measured and why (i.e., how success is defined). The selection process involves identifying dimensions and/or variables relevant to an enterprise's successful operation.

Relevancy is the most important concern in selecting performance variables. Traditionally, the overriding concern has been with the accuracy of measures. Because we find it difficult to accurately measure what we want, we have chosen to want what we can accurately measure. Unfortunately, the more accurate the measure of the wrong criteria, the faster the road to disaster. We are much better off with an approximation of relevant variables than with precise measurement of the wrong ones.

Viability of a business enterprise is an emergent property. It is the product of the interactions of various entities. It cannot be measured directly (i.e., by using any of the five senses). We can only measure its manifestation. Growth is the most popular one, but some prefer return on investment while others like net present value of future cash flows. Unfortunately, using the single manifestation of a phenomenon as the measure of an emergent property has proved misleading and very costly. For example, if a business is successful, chances are it will grow; however, growth alone does not mean that the business is successful. The same outcome (manifestation) could be produced by different means. Lousy acquisitions can produce high rates of growth but at the same time destroy the company.

Therefore, when we measure an emergent property by means of its manifestation we have to do it along several dimensions. For example, concern for people, when combined with the concern for production, has quite a different manifestation from the one without it. In his famous work, The Managerial Grid, Blake (1968) demonstrated how the nature of a variable in a “1.9 orientation” is different from the nature of the same variable in a “9.9 orientation.” Freedom without justice leads to chaos, while justice without freedom leads to tyranny.

9.5.2.2 Performance measures

Performance measures are the operational definition of each variable, that is, how each variable is to be measured specifically. For example, if we have identified capacity utilization as a performance criterion, then the turnover ratio might be designated as its measure. We would then need a procedure for calculating the turnover ratio (e.g., divide sales by assets, revenues by assets, or output by input).
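
The choice among these procedures is not cosmetic; each operational definition yields a different number for the same firm. A minimal sketch (all figures illustrative, not from the text) comparing the three calculations:

```python
# Three operational definitions of the "turnover ratio" for the same firm.
# All figures are illustrative.
sales = 1_200_000      # annual sales, dollars
revenues = 1_250_000   # annual revenues, dollars (sales plus other income)
assets = 800_000       # total assets, dollars
output_units = 50_000  # units produced
input_units = 40_000   # units of input consumed

sales_turnover = sales / assets                 # sales-based definition
revenue_turnover = revenues / assets            # revenue-based definition
physical_turnover = output_units / input_units  # physical output/input definition

print(sales_turnover, revenue_turnover, physical_turnover)
```

Unless the procedure is pinned down, two analysts can report different "capacity utilization" for the same operation while both follow the stated criterion.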

An important consideration in selecting any measure is its simplicity. The cost of producing a measurement should not exceed the value of the information it generates. Although objective measures are preferable, if the cost of obtaining an objective measure is prohibitive, then use a subjective one. Remember that collective subjectivity is objectivity (provided that collectivity represents a variety of value systems). For example, in evaluating the performance of a gymnast we rely on the collective judgment of a number of different judges.
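
The gymnastics example can be made concrete. A minimal sketch (the trimming rule and the figures are illustrative, not prescribed by the text) that aggregates subjective judgments by discarding the extreme scores and averaging the rest, so that no single value system dominates:

```python
# Gymnastics-style aggregation of subjective scores: drop the single highest
# and single lowest judgment, then average the remainder.
def collective_score(scores):
    if len(scores) < 3:
        raise ValueError("need at least three judges to trim extremes")
    trimmed = sorted(scores)[1:-1]  # discard one lowest and one highest score
    return sum(trimmed) / len(trimmed)

# Five judges score a routine; the 8.8 and 9.5 are discarded.
print(collective_score([9.2, 8.8, 9.5, 9.0, 9.1]))
```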

Development of effective performance measures is easier said than done. More often than not, operational definitions are left vague and ambiguous, even when the underlying concepts are relatively simple, such as minimizing cost. The usual practice of allocating overhead to various operating units demonstrates how an innocent matter of convenience produces unintended consequences.

The criteria for allocating overhead are usually based on conventional wisdom. Factors such as the space occupied by a unit, or the labor content of a production process, are among the most popular. Since overhead usually constitutes more than 40% of total cost, we should not be surprised to see that these variables (space and direct labor) are the ones targeted for cost reductions. The fact that the allocation rules were only a convention no longer matters. Once the allocation criteria become a rule, their relation to the generation of cost is, by default, assumed to be causal, as the following case demonstrates.

A large supermarket chain decided to close down ten of its stores because the accounting system showed they were not covering their allocated overheads. Since the shutdown had no effect on overhead, the remaining stores had to carry a larger share of it. This in turn put a few more stores in the red, and they too were subsequently closed. The company was gradually withdrawing from the market.

Then a new design was developed. Each store became responsible for its own operation without having to worry about any artificial overhead allocated to it. The surplus generated by each store was passed on to the corporation as the income of the executive office. This made the executive office a profit center responsible for managing its operation, the so-called overhead, within the bounds of its income and the profit it needed to generate to meet the cost of its capital.
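
The arithmetic behind such a spiral is easy to reproduce. In this minimal sketch (all figures invented), a fixed corporate overhead is spread evenly over the open stores, and any store whose operating margin fails to cover its share is closed. Note that the chain's total margin (1,250) comfortably covers the overhead (1,000), yet the allocation rule closes every store:

```python
# Death-spiral sketch: fixed corporate overhead is spread evenly over the
# open stores; any store whose operating margin fails to cover its share
# is closed, which raises every survivor's share on the next round.
overhead = 1000.0  # fixed corporate overhead, unaffected by store closures
margins = [80, 90, 100, 110, 120, 130, 140, 150, 160, 170]  # per-store margins

open_stores = list(margins)
while open_stores:
    share = overhead / len(open_stores)
    survivors = [m for m in open_stores if m >= share]
    if len(survivors) == len(open_stores):
        break  # every remaining store "covers" its allocation; spiral stops
    open_stores = survivors

print(len(open_stores))  # rounds: 10 -> 8 -> 5 -> 0 stores remain
```

The chain as a whole was profitable, but judging each store against an arbitrary allocation shut the whole system down, which is exactly the supermarket case above.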

This situation is by no means atypical. With the prevalence of allocation rules based on labor, pressure is unduly shifted to direct labor. The default reaction is to lay off productive manpower. If the police department is facing deficits, police officers are the first to be fired. If schools are in financial trouble, the number of teachers is reduced. Reducing operating units does not automatically reduce overhead, as management seems to assume. On the contrary, it increases the burden on the remaining units until the whole system comes to a halt.

In the mid-1970s, when per capita income was the conventional measure of development, sudden increases in oil prices produced instantly “developed” nations. Since this was not acceptable, a new set of indicators had to be developed. We now have a whole series of indicators that substitute for development, such as per capita steel production and per capita consumption of fuel. It is not surprising, then, to find national development policies aimed at improving these measures, usually at an incredible cost to the society at large. Yes, winning is fun. But to win, one has to keep score. And the way one keeps score defines the game.

9.5.2.3 Viability matrix

The viability matrix developed in Table 9.1 is a framework for identifying the relevant dimensions — the performance variables — for measuring a business's viability or the different aspects of an operation.

Table 9.1. Viability Matrix: Identifying Dimensions and Variables

Columns: Structure (inputs) | Function (outputs) | Environment (markets) | Process (know-how)

Throughput: Capacity utilization; Attributes of the outputs (Cost, Quality, Availability); Access mechanism; Throughput capability; Waste; Cycle time; Safety; Control; Profitability; Reliability of demand

Synergy: Default values of the culture; Compatibility of performance criteria; Credibility in the marketplace; Relations with suppliers, creditors, and customers; Value-chain analysis; Reward systems; Value-added ratio

Latency: Bench strength; Product potency; Market potential; Early warning system; Core knowledge; Intensity of competition; Planning process

The first dimension of this matrix identifies the variables that define the organization as a whole:

Structure (inputs)

Function (outputs)

Environment (markets)

Process (technology)

The second dimension of the viability matrix identifies the processes that define the totality of the management system:

Throughput (production of the outputs)

Synergy (management of interactions, adding value)

Latency (defining problems and designing solutions)
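
For readers who want to work with the matrix programmatically, a minimal sketch of its skeleton, using only the row and column labels from Table 9.1 (the cell placements of the example variables are illustrative, not prescribed by the text):

```python
# Skeleton of the viability matrix: management processes (rows) crossed with
# organizational dimensions (columns); each cell holds performance variables.
DIMENSIONS = ["Structure (inputs)", "Function (outputs)",
              "Environment (markets)", "Process (know-how)"]
PROCESSES = ["Throughput", "Synergy", "Latency"]

viability_matrix = {p: {d: [] for d in DIMENSIONS} for p in PROCESSES}

# A few example entries drawn from Table 9.1 (placement is illustrative):
viability_matrix["Throughput"]["Structure (inputs)"].append("Capacity utilization")
viability_matrix["Synergy"]["Environment (markets)"].append("Credibility in the marketplace")
viability_matrix["Latency"]["Process (know-how)"].append("Planning process")

# The 3 x 4 grid gives twelve cells in which to identify relevant variables.
print(len(PROCESSES) * len(DIMENSIONS))
```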

URL: https://www.sciencedirect.com/science/article/pii/B978012385915000009X
