Finance + Tech: Exponential Capabilities Enabled by Technology
By Carl Nordman
Technology innovation over the years has elevated the speed, complexity and volatility of commerce globally. It has enabled process and productivity improvements in business and, more importantly, has generated enormous amounts of data on every aspect of business performance.
To keep pace, today’s financial executives have access to vastly improved tools and practices for performance management and analytics. This in turn has increased the efficacy of finance and elevated the role of the financial executive to that of strategic partner to the enterprise.
Finance practices throughout history have evolved in significant ways that have had a profound influence on commerce and culture across the world, arguably contributing to the emergence of global commerce and enabling the rise of modern capitalism.
As we reflect on FEI’s 80th anniversary and IBM Corp.’s centennial, it has indeed been a period of incredible technology innovation. During this time, business and financial executives have been front and center in the evolution, benefiting from technology innovation and using it to improve business performance and outcomes, continuing a rich history of influencing the direction and growth of global commerce and, arguably, broader society.
The strategic role of today’s financial executives is increasingly defined and enabled by the quality and speed of information, the result of 80 years of technology innovation, and its use to drive business performance insights. Finance’s growing role and prominence in the enterprise depend largely on access to high-quality, integrated performance information and on the incisiveness of the business insights drawn from it. A key future challenge will be how to further leverage this role with the emergence of the “Age of Big Data.”
Big Data refers to the increasing growth and complexity of data and information in terms of volume, speed and variety. Finance is uniquely positioned at the hub of enterprise financial, operational and risk data to best take advantage of emerging trends in Big Data.
Origins of Modern Accounting and Finance Tools
In the early 1900s, population growth, speed of commerce and economic complexity were increasing rapidly in the United States. As a consequence, new means to perform accounting efficiently and accurately were imperative.
The punched card tabulator was the pivotal innovation in the Machine Age that led to vast improvements in how accounting was performed. Its development can be traced back to the 1880s, when Herman Hollerith, who worked at the U.S. Census Bureau, had an idea for a machine using punch cards that could count and sort census results far faster than human clerks.
The bureau’s challenge then was that counting the census by hand could no longer be completed in time to be useful, and, consequently, the first tabulating machines were developed and used to count the 1890 census.
In 1896 came the introduction of the integrating tabulator, which could add numbers and was a clear advance over merely counting holes. Though its advantages for accounting and inventory tracking were immediately apparent, it would be quite some time before the technology became mainstream and large scale.
In the 1930s, the formation of the Social Security Administration (SSA) was the catalyst for mechanical accounting devices to really take off. The SSA’s impact on American business created the largest accounting initiative in history: 3.5 million businesses would need to establish payroll withholding systems for 27 million employees to comply with the rules, and the largest companies would need high-speed accounting machines to do it.
During the 1930s and ‘40s, increasingly sophisticated electric accounting machines were marketed by International Business Machines Corp.’s Electric Accounting Machine division, which operated in 79 countries and maintained some 80 sales offices throughout the U.S.
There was widespread adoption of punch card and tabulating machine technology for accounting-related processing, a trend that continued into the 1960s and ‘70s, when punch cards were the backbone of computing — until the introduction of more reliable and cost-effective technologies.
The longevity of punch cards, which served for more than 50 years as the medium for input, processing and output across mechanical tabulators, then electronic calculators and computers, is impressive. As late as the 1970s, punch cards provided the backbone for these basic business functions, a testament to their reliability and cost-effectiveness compared to the other options available at the time.
Among the machines designed to work with punch cards were keypunches (to encode the cards), sorters (to put them in proper order for processing) and readers (like tabulators and calculators, to aggregate and output information). Though an extremely reliable medium for its time, the punch card eventually became a bottleneck, as computer innovation pushed past the technology’s ability to keep up.
Innovation in computer processing technology, including vacuum tubes and transistor technology, dramatically increased processing speed, taxing the practicality of punch cards as an input, storage and output device. Punch cards essentially became too slow to keep up with new processing power available in later generations of computers.
First-Generation Computers — Vacuum-tube Based
The period of first-generation computers based on vacuum tube technology lasted from approximately 1937 to 1955. Vacuum tube technology would have limited commercial uses due to cost and reliability issues, but set the stage for future generations of computers to go mainstream.
This new technology marked a transition from mechanical to electronic computing that brought greater processing power and capacity. Further, the integration of peripheral components into this new generation of computer foreshadowed future computer “systems” that married keyboards, printers, magnetic storage and punch card devices with powerful central processing units.
In 1951, the UNIVAC I (Universal Automatic Computer) was introduced, perhaps the best-known of the first-generation computers and a commercial success on a limited basis: 46 units were delivered to institutions.
UNIVAC had 5,600 vacuum tubes and 18,000 crystal diodes, and integrated magnetic tape and a typewriter. The first few installations of UNIVAC included the U.S. Census Bureau and Prudential Insurance Co. In 1954, General Electric Co. developed the first industrial payroll application on the UNIVAC.
The evolution from vacuum tube to transistor, then to the integrated circuit and finally microprocessor technology, greatly improved reliability and drove down costs, bringing an affordable value proposition to mainstream businesses.
Future Generations of Computing Technology
Second-generation computers based on transistors — invented at Bell Labs in 1947 by William Shockley, John Bardeen and Walter Brattain — swapped the unreliable vacuum tube with this electronics breakthrough. Transistors brought greater reliability, required less power consumption and space and were much cheaper to manufacture.
This transistor-based generation of computers only lasted about seven years (1956-63), largely because computer circuitry based on individual transistors became too complex to design and build. But next-generation innovations solved the problem.
The commercialization of computing, along with the trajectory described by Moore’s Law (advanced by Gordon Moore, co-founder of Intel Corp.), really kicked off with third-generation computers. Developed from 1964-71, these were based on circuits of miniaturized transistors integrated into silicon chips or semiconductors, hence the term “integrated circuit.” The even lower cost and greater scalability of this technology ultimately led to today’s fourth-generation computers based on microprocessors.
The continued improvement in processing power, scale and cost is the basis of Moore’s Law: Moore predicted the number of transistors on a silicon chip would double approximately every two years. His theorem addresses the cost aspect as well, suggesting that the cost per transistor falls as more transistors are built on the same silicon wafer. Moore’s Law, first published in his 1965 paper, continues to hold true.
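Moore’s observation is compact enough to state in a few lines of code. The minimal sketch below, in Python, projects transistor counts at a two-year doubling interval; the 1971 baseline of roughly 2,300 transistors corresponds to Intel’s first microprocessor, though the parameters here are illustrative rather than drawn from Moore’s paper.

```python
# Illustrative arithmetic behind Moore's Law: transistor counts double
# roughly every two years. Baseline values are illustrative (the Intel
# 4004 of 1971 had about 2,300 transistors).

def transistors(year, base_year=1971, base_count=2_300, doubling_years=2):
    """Projected transistor count, doubling every `doubling_years` years."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

for year in (1971, 1981, 1991, 2001, 2011):
    print(f"{year}: ~{transistors(year):,.0f} transistors")
```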
Computing Goes Mainstream For Business
In the 1960s, numerous innovations converged to produce some of the most advanced general use and peripheral-rich computer systems that had the right combination of reliability, availability and cost to be attractive to mainstream business.
Several things contributed to evolving general business use platforms, including:
■ Integrated circuit technology, with its increasing reliability and reduced costs, that brought next-generation computing into the mainstream;
■ Improved peripheral device integration with better input, storage and output capabilities, like video display terminals (VDT), tape drives, disk drives and printers;
■ Higher-level programming languages like COBOL (designed expressly for business data processing), ALGOL and FORTRAN, which provided the means to develop complex business applications like accounting systems and eventually enterprise resource planning (ERP);
■ Development of business and accounting applications, built for mainframe systems available for “time share” and minicomputers or “midrange” systems; and
■ Connectivity enabled through telecommunications technology and the emergence of the Internet.
Magnetic Tape and Disk Drives Break the Data Logjam
General-purpose business applications require that large amounts of data be fed into the processor for calculations, so fast input, storage and output media are essential. Early computing innovation was led by the scientific community, where processing power mattered most for handling complex calculations. Processor power and speed were the early focus, and for that, punch cards were fine.
Data storage technology was far less important until the development of business-oriented applications, which in turn drove innovation in magnetic media. Magnetic tape and disk drives changed the computing landscape by making storage of a large amount of data more practical and accessible.
For instance, a single reel of 1/2-inch magnetic tape could store as much information as 10,000 punch cards, and came in lengths measuring anywhere from 2,400 to 4,800 feet. The long length presented plenty of opportunities for tears and breaks.
The first commercial uses of hard disk drives began in 1956 with IBM’s 305 RAMAC (Random Access Method of Accounting and Control). Weighing in at over a ton, and requiring 16 square feet of floor space, the disk-drive capacity was an astounding (at the time) 5 megabytes, the equivalent of about 64,000 punch cards.
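The punch card equivalence is straightforward arithmetic, assuming the standard 80-column card holding one character per column; a quick check in Python:

```python
# Rough equivalence of RAMAC capacity to punch cards, assuming the
# standard 80-column card (one character per column).
ramac_chars = 5_000_000             # about 5 MB, per the figure above
chars_per_card = 80                 # one 80-column card ~ 80 characters
print(ramac_chars / chars_per_card) # => 62500.0, on the order of 64,000 cards
```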
IBM marketed this directly to businesses for accounting functions and it was a commercial success. Finance and accounting functions experienced enormous efficiency improvements due to magnetic tape and disk drive technology.
Today, storage is not just highly commoditized; it is almost free, compared to its cost when first introduced.
Early Business and Accounting ‘Apps’
The availability of mainframe-based accounting programs coincided with new solid-state mainframe systems and the emergence of business-oriented programming languages such as COBOL.
Common accounting and other business applications developed for mainframe and minicomputer platforms included sub-ledger systems for payroll, payables, time and expense tracking, billing and inventory management. General ledger applications with reporting modules were also developed, to support period close and financial reporting needs.
These applications brought orders-of-magnitude improvements in processing and information management to finance. By today’s standards they were poorly integrated and relied heavily on manual entry, punch cards and batch processing. At the time, though, they represented a huge leap in capability, contributing to better financial controls and improved efficiency across the record-to-report process.
For many companies, however, cost was a barrier to adoption, so vendors began to offer the applications in a “time-sharing” environment: a business would be billed for how much time its applications used on a common mainframe computer, a model not unlike today’s cloud computing.
These early financial applications typically relied on manual steps to take sub-ledger activity and post it into a general ledger, whether through direct keystroke on a VDT or using punch cards to run batch jobs, as part of the close cycle.
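To make that posting step concrete, here is a minimal, hypothetical sketch in Python of the kind of batch logic involved: summarize sub-ledger activity by account, verify the batch balances and post one journal line per account. Accounts and amounts are invented for illustration.

```python
# Hypothetical sketch of a close-cycle batch step: summarize sub-ledger
# activity by GL account and post one journal line per account.
# Accounts and amounts are invented for illustration.
from collections import defaultdict

subledger = [
    ("2100-AP", -1_250.00),   # accounts payable invoices (credits)
    ("2100-AP",   -480.00),
    ("6000-EXP", 1_250.00),   # offsetting expense entries (debits)
    ("6000-EXP",   480.00),
]

def post_to_gl(entries):
    totals = defaultdict(float)
    for account, amount in entries:
        totals[account] += amount
    # A balanced batch must net to zero before posting.
    assert abs(sum(totals.values())) < 0.005, "journal out of balance"
    return dict(totals)

print(post_to_gl(subledger))  # {'2100-AP': -1730.0, '6000-EXP': 1730.0}
```

The balancing check mirrors the control objective of the period close: no journal posts unless debits and credits net to zero.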
Tools Converge Into a Ubiquitous Environment
The arrival of the personal computer in the early 1980s spawned the adoption of PC-based business software programs for companies of all sizes. From an accounting systems standpoint, early PCs usually ran either standalone small-business applications or terminal emulators for the existing mainframe and minicomputer applications.
From a financial executive perspective, these innovations were a double-edged sword:
■ As networking and file-sharing technologies improved, access to corporate data by finance personnel exploded;
■ File management technologies spawned the “data management” function in many companies, where the primary objective was to support individual requests for ad hoc data from mainframe systems;
■ File management and early databases extended data management capabilities to the finance and accounting desktop; and
■ Spreadsheet software and database tools provided a blank canvas to load data, customize calculations, perform analysis, develop trends and do scenario planning.
On the one hand, the almost free-for-all access to corporate data drove a productivity improvement that is hard to overstate. Access to corporate information (particularly financial) was still controlled by policy, system security and granted privileges, so it was not out of control from a security perspective.
But finance personnel in analyst roles across the business could now operate in a new, very independent and highly flexible environment. Early on — absent formal governance and discipline around data, information and analysis — financial executives quickly leveraged this to perform complex analysis of business performance in an isolated PC-based environment.
The result, however, was multiple versions of the truth with limited transparency, a challenge many finance organizations today are all too familiar with. Further, PCs and network-based standalone and client/server systems proliferated as point solutions built to meet specific departmental challenges, with finance a leader in leveraging this new flexibility. The result, again, was structural complexity across business information, processes and systems.
By the 1980s, more advanced database management systems (DBMS) were being developed as the underpinning of many mainframe, client/server and PC-based applications.
And by the early 1990s, very robust relational databases began to emerge. These databases enabled the development of client/server applications that were cheaper still, compared to mainframe and minicomputer systems. Data and functionality-rich business applications built on top of new relational databases and client/server technology included ERP.
Contemporary database capabilities have extended beyond the management of discrete, structured data, such as financial and operational data. Today they support applications that can pull and organize a variety of structured and unstructured data, images, video and pretty much anything else.
Connectivity among computer systems and between people and these systems was an essential development that ultimately led to the rise of the Internet. Telecommunications technology provided the road for data and information to travel, contributing to the rise in broad-based computing. Video display terminals, personal computers and now mobile devices are reliant on this technology.
With the advent of the Internet, browser technology based on standard protocols and HTML emerged, and business application vendors retooled their applications to leverage this new interface. The availability of and access to data increased by orders of magnitude; the data and information floodgates opened wide, offering access through more efficient and intuitive methods, including search engines and powerful analytical software tools.
The Evolution of Enterprise Resource Planning
The genesis of modern enterprise resource planning systems can be traced back to inventory management and control (IMC) software developed in the 1960s, which used COBOL, ALGOL and FORTRAN to run on mainframe systems. Memory limitations on mainframe computer systems starting in the ‘60s drove programmers to creatively “short-cut” program code, such as using two digits for the year instead of four, leading to the infamous Y2K scare 30 years later.
When 1999 was about to become 2000, anticipated system failures from these shortcuts launched huge business-application remediation efforts that cost enormous sums of money. Perhaps more importantly, Y2K spread the adoption of an alternative: implementing ERP applications.
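The shortcut, and why it failed, fits in a few lines. A hypothetical Python illustration of two-digit date arithmetic breaking at the century rollover:

```python
# Hypothetical illustration of the two-digit-year shortcut behind Y2K.
# Storing "99" for 1999 saved precious memory on early systems, but
# elapsed-time arithmetic breaks once the century rolls over.

def years_elapsed_2digit(start_yy, end_yy):
    return end_yy - start_yy

print(years_elapsed_2digit(98, 99))  # 1   (1998 -> 1999, correct)
print(years_elapsed_2digit(99, 0))   # -99 (1999 -> 2000, nonsense)
```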
In the 1970s, applications were written to support manufacturing processes and related data processing requirements. These material requirements planning (MRP) applications leveraged the earlier IMC software, were orders of magnitude more complex and came much closer to mirroring an actual business process than anything before.
The MRP application would synchronize inputs, activities and outputs across the manufacturing supply chain, coordinating raw material and work-in-process (WIP) inventory through the production process.
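At the core of MRP sits a netting calculation: gross requirements, less inventory on hand and scheduled receipts, yield the net quantity to produce or purchase. The following is a simplified, hypothetical sketch in Python; real MRP adds lead-time offsets, lot sizing and multi-level bill-of-material explosion.

```python
# Simplified MRP netting for one item across weekly periods.
# Real systems add lead-time offsets, lot sizing and BOM explosion.

def mrp_net(gross_reqs, on_hand, scheduled_receipts):
    net_reqs = []
    for period, gross in enumerate(gross_reqs):
        available = on_hand + scheduled_receipts[period]
        net = max(0, gross - available)        # shortfall to make or buy
        on_hand = max(0, available - gross)    # inventory carried forward
        net_reqs.append(net)
    return net_reqs

# Hypothetical demand for four weeks, 40 units in stock, one inbound order.
print(mrp_net([30, 30, 30, 30], on_hand=40, scheduled_receipts=[0, 20, 0, 0]))
# => [0, 0, 30, 30]
```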
Further innovation in the 1980s led to building a layer of workflow management capability, improving process controls and standardizing them across product planning, supply, manufacturing, inventory control, sales and product distribution modules.
By this point, MRP systems had combined a host of features across planning, billing, supply chain, inventory management and payables, driving some of the first cross-functional integration of process and data functions into a single business application architecture.
Then, by the 1990s, modules for other corporate service functions were developed, including human resources, payroll, time and expense, general ledger, consolidation and reporting. While still very modular, these applications integrated better than anything that had been designed before, due to advances in workflow, security, controls and database technology.
The single greatest challenge for many companies remains the business case for implementing ERP solutions. Historical anecdotes yield mixed views as to whether the investment delivers an attractive return.
However, recent studies provide empirical evidence that an expanded view of the benefits case strongly supports the ERP investment when it is properly managed for scope and risk. ERPs improve the integration of processes and data from across the enterprise into finance, contributing to a single version of the financial truth, an essential information component for tackling the Big Data challenges ahead.
Analytics for Finance
Powerful analytical software for finance, a development of roughly the past 20 years, has evolved from advances in relational databases and the application of more intuitive and flexible front-end tools, coupled with ubiquitous access to both internal corporate data and external macroeconomic and industry data.
The front end to today’s analytical tools is the key, providing business users the flexibility to relate disparate data, structure analyses, feed in assumptions and present the results, all from a common data source and without having to learn complex programming languages. Basically, analytics has become ubiquitous as well.
Finance’s role, to perform analysis and drive performance insights, is highly dependent upon this kind of flexibility. Today’s analytics tools also resolve the transparency and multiple-versions-of-the-truth issues that arose with the advent of the PC and standalone computing.
Properly designed and implemented, analytical tools for finance will run on a common set of corporate data governed by a codified set of rules and definitions, and provide the flexibility to cut, slice, drill down, pivot and drive assumptions into scenario analysis in ways that are only limited by the analyst’s creativity.
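That slice-and-pivot flexibility maps directly onto modern data tools. As a minimal illustration (the tool choice and data are assumptions, not from the article), a pivot of profitability by region and product using Python’s pandas library:

```python
# Minimal sketch of slice-and-pivot analysis on a common data set,
# using pandas. The tool choice and the data are illustrative.
import pandas as pd

data = pd.DataFrame({
    "region":  ["East", "East", "West", "West"],
    "product": ["A", "B", "A", "B"],
    "revenue": [120.0, 80.0, 150.0, 60.0],
    "cost":    [70.0, 50.0, 90.0, 45.0],
})
data["margin"] = data["revenue"] - data["cost"]

# Pivot: profitability by region and product from one governed source.
print(data.pivot_table(index="region", columns="product",
                       values="margin", aggfunc="sum"))
```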
Among the familiar applications pertinent to finance are operational analytics, multi-dimensional profitability reports, scorecards and performance dashboards.
The top agenda item for many chief financial officers is improving forecasting, particularly over the medium and long term. Yet, based on numerous surveys over the past 12-18 months, most CFOs are highly dissatisfied with their enterprises’ operational and financial forecasting capabilities.
At the extreme end of analytical maturity for finance is embedding risk-related data into the analytical platform to drive scenario analysis of the financial impacts of various risk events, in order to prepare risk mitigation strategies. The future challenge for finance in analytics will be to succeed in managing, and not drowning in, Big Data. The key to preparedness is a foundation of common practices, processes and transparency that allows it to be leveraged successfully.
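As a concrete, hypothetical illustration of that risk-embedded scenario analysis, the sketch below simulates the combined annual financial impact of independent risk events, each with an assumed probability and loss; all figures are invented.

```python
# Hypothetical Monte Carlo sketch: combined financial impact of
# independent risk events, each with an assumed probability and loss.
import random

risks = [  # (name, annual probability, loss in $ millions) - invented
    ("supplier failure", 0.10, 12.0),
    ("FX shock",         0.25,  5.0),
    ("demand shortfall", 0.15,  8.0),
]

def simulate_total_loss(trials=100_000, seed=42):
    rng = random.Random(seed)
    losses = []
    for _ in range(trials):
        total = sum(loss for _, p, loss in risks if rng.random() < p)
        losses.append(total)
    losses.sort()
    return sum(losses) / trials, losses[int(0.95 * trials)]

expected, p95 = simulate_total_loss()
print(f"expected annual loss: ${expected:.1f}M, 95th percentile: ${p95:.1f}M")
```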
And though many financial executives are buried in day-to-day operations issues — accounting, compliance, regulations and more — the solution is the strategic management of data, which has raised the game and will continue to do so in the future.
The Path Forward: Leveraging ‘Big Data’
This brief peek back in history suggests the profession can be credited with helping form and then institutionalize — with accounting practices — the concepts of wealth and numbers, writing, the development of money, banking and debt.
Codified or not, these methods and practices have been primary tools enabling the progress of civilization and the emergence of commerce. For most of this history, finance’s role was predominantly that of scribe and recorder, its science lacking codification until about 520 years ago and relying on manual, primitive tools until technology innovation beginning 80 years ago started to raise the game.
Though the last 80 years have brought more complexity, speed and volatility to business, they have also enabled better standards and efficiency in process and data and, as a consequence, immensely improved business management practices. These innovations have converged to give finance unprecedented access to vast amounts of data, creating new opportunities to assess, manage and invest scarce capital resources into the best commercial opportunities.
This is what has raised the game for financial executives, who are increasingly being called upon to help cope with and manage complexity and business performance within their companies. The importance of the profession has been raised to unprecedented levels.
To cope with the increased complexity, speed and volatility of commerce brought on by technology innovation, financial executives need to focus on training their workforce to appreciate and embrace a broader set of responsibilities and challenges commensurate with changes across the broader business ecosystem. These include better technical skills in finance and risk management, deeper industry and operations knowledge and, naturally, analytics competency.
Carl Nordman is global leader of the Financial Management Research team in the IBM Institute for Business Value, responsible for conducting primary research and publishing fact-based thought leadership for finance and operations executives. He has extensive experience advising CFOs and senior financial executives on finance and operations transformation. As FEI celebrates its 80th anniversary, see how IBM is celebrating its 100th year (www.ibm.com/100).