There is debate among economists and skeptics from other disciplines on the significance of the Information Age. For the last fifty years, Moore’s law has accurately predicted the exponential growth of computing capacity. If exponential growth continues, exceedingly rapid growth will occur at some point, but when the curve will become steep is hard to predict. IT has seen rapid growth in the first decade of the twenty-first century. The rate of change seems to be increasing and IT is penetrating into lives in ways that were impossible a few years ago. IT may have reached escape velocity. Service-orientation, cloud, and IT integration are critical in harnessing the blast of new capacity.
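To make the shape of the curve concrete, the sketch below assumes a doubling period of two years, one common statement of Moore's law (the doubling period and baseline are illustrative assumptions, not figures from the text). It shows why exponential growth that looks modest over a decade becomes explosive over five.

```python
# Illustrative sketch (assumption: capacity doubles every two years,
# one common reading of Moore's law). Computes the capacity multiple
# relative to an arbitrary starting point.

def moore_factor(years: float, doubling_period: float = 2.0) -> float:
    """Capacity multiple after `years`, doubling every `doubling_period` years."""
    return 2.0 ** (years / doubling_period)

# One decade: 2**5 = 32x. Five decades: 2**25, roughly 33 million-fold.
print(moore_factor(10))
print(moore_factor(50))
```

The same function also shows why the "steep" part of the curve is hard to locate: on an exponential, every point looks like the knee from far enough back.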
The term IT is used differently in different contexts. Among engineers, it usually refers to the combination of hardware and software that makes up networked computer systems. In service management, the term usually also includes the department or group that is charged with developing and maintaining these systems and includes people, processes, and governance. IT is often also used as a general term for computing disciplines. In the present work, IT is almost always used in the service management sense.
Rich Miller, Data Center Knowledge, “Amazon: 762 Billion Objects Stored on S3 Cloud,” http://www.datacenterknowledge.com/archives/2012/01/31/amazon-762-billion-objects-stored-on-s3-cloud/, January 31, 2012.
Rich Miller, Data Center Knowledge, “Facebook Builds Exabyte Data Centers for Cold Storage,” http://www.datacenterknowledge.com/archives/2013/01/18/facebook-builds-new-data-centers-for-cold-storage/, January 18, 2013.
The estimate is imprecise. See Leslie Johnston, Library of Congress, The Signal: Digital Preservation, “A ‘Library of Congress’ Worth of Data: It’s All in How You Define It,” http://blogs.loc.gov/digitalpreservation/2012/04/a-library-of-congress-worth-of-data-its-all-in-how-you-define-it/, April 2012.
Big data has no crisp definition. The term was coined in response to the increased size and changed nature of the data that has begun to be collected and processed in the last few years. Generally, big data has three characteristics: it is large, measured in terabytes, petabytes, and larger units; it must be processed rapidly to be useful; and it may not be in traditionally structured form.
Use of paper checks by consumers has declined sharply, but the Federal Reserve estimates that they will survive for another decade. Behind the scenes, paper has almost completely disappeared. Prior to September 11, 2001, paper checks were hauled by truck and airplane from bank to bank for clearing. Aircraft grounding following 9/11 brought a near crisis in the financial system by delaying check clearing. Regulations were changed to allow electronic transfers of images. A decade later, almost all physical movement of paper checks has been eliminated. See David B. Humphrey and Robert Hunt, “Getting Rid of Paper: Savings from Check 21,” working paper 12-12, Philadelphia: Federal Reserve Bank of Philadelphia, http://www.philadelphiafed.org/research-and-data/publications/working-papers/2012/wp12-12.pdf, May 2012.
Military computing has taken a new twist recently as cyberwarfare has transformed computing from a tool for developing weapons into both a formidable weapon in itself and a valuable and ubiquitous asset that must be protected. This transformation is important in assessing the growing significance of the information age and IT integration.
Von Neumann’s original description of a programmable computer appeared in “First Draft of a Report on the EDVAC,” published in 1945. The document was published prematurely, and the concept inadvertently became unpatentable. A reproduction of the document can be found at http://virtualtravelog.net.s115267.gridserver.com/wp/wp-content/media/2003-08-TheFirstDraft.pdf
The priority claims of EDVAC (electronic discrete variable automatic computer) and the slightly earlier ENIAC (electrical numerical integrator and computer) are controversial. The ENIAC was programmable in the sense that its modular design permitted reprogramming by physically rearranging jumpers and cables. The EDVAC could be reprogrammed without physically changing the hardware. The EDVAC was designed by the same team that designed the ENIAC, and construction of the EDVAC began before the ENIAC was complete. At the core of the controversy is the fact that von Neumann was a latecomer to the team. Although only von Neumann’s name appears on the EDVAC design paper, other members of the team assert that the EDVAC concepts were developed prior to von Neumann’s participation.
Douglas F. Parkhill, The Challenge of the Computer Utility (Reading, MA: Addison-Wesley, 1966).
Speculating on potential walls to Moore’s law is interesting. Resource depletion is a possibility. Silicon (sand) is not likely to be a limiting factor, but other elements used in trace quantities in computer chips may limit growth. Ray Kurzweil suggests a different kind of limit, at which computer intelligence completely exceeds human intelligence (The Singularity Is Near: When Humans Transcend Biology, New York: Penguin, 2005).
Numbers are scarce, but Internet companies such as Facebook clearly have the largest datacenters. See Rich Miller, “Who Has the Most Web Servers?,” http://www.datacenterknowledge.com/archives/2009/05/14/whos-got-the-most-web-servers/, May 14, 2009.
Many dispute the significance and benefits of social networking and the Internet in general. Evgeny Morozov is an articulate exponent of the anti-triumphalist view of the Internet. (See Noam Cohen, The New York Times, “The Internet’s Verbal Contrarian,” http://www.nytimes.com/2013/08/15/business/media/the-internets-verbal-contrarian.html?pagewanted=all&_r=0, August 14, 2013.) Morozov focuses his critique on the political influence of technology rather than the technology itself—in other words, what is done with the Internet rather than what it is.
This is not to dismiss the many more practical uses of social networking today, but the prodigious consumption of computing resources on pure sentiment is an indication of the wealth of resources available.
Peter Mell and Timothy Grance, “The NIST Definition of Cloud Computing: Recommendations of the National Institute of Standards and Technology,” http://csrc.nist.gov/publications/nistpubs/800-145/SP800-145.pdf, September 2012.
ITIL will come up often in this book. ITIL began as a collection of documents published to guide IT practice in the UK government. It has evolved into a globally recognized compendium of best practices for the service management of IT.
David Cannon, David Wheeldon, Shirley Lacy, and Ashley Hanna, ITIL Service Strategy (London: TSO, 2011), p. 451.
Redundant arrays of independent disks (RAIDs) are arrays of hard drives that work in coordination to provide greater reliability and performance than a single drive. The data distribution pattern of the array can be configured for varying tradeoffs between performance, error resilience, and capacity. These data distribution patterns are called levels.
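The tradeoff among capacity, resilience, and performance can be made concrete with the standard usable-capacity formulas for a few common RAID levels. This is an illustrative sketch, not drawn from the text; the function name and the choice of levels 0, 1, 5, and 6 are the author's illustration.

```python
# Illustrative sketch: usable capacity for common RAID levels, given an
# array of n identical drives. The formulas are the standard ones for
# RAID 0 (striping), 1 (mirroring), 5 (single parity), 6 (double parity).

def raid_usable_capacity(level: int, n_drives: int, drive_tb: float) -> float:
    """Return usable capacity in TB for an array of identical drives."""
    if level == 0:   # striping: full capacity, no redundancy, fastest
        return n_drives * drive_tb
    if level == 1:   # mirroring: half the raw capacity, survives one failure
        return (n_drives // 2) * drive_tb
    if level == 5:   # single parity: one drive's worth of capacity lost
        return (n_drives - 1) * drive_tb
    if level == 6:   # double parity: two drives' worth lost, survives two failures
        return (n_drives - 2) * drive_tb
    raise ValueError(f"unsupported RAID level: {level}")

# Example: eight 4 TB drives under different levels
for level in (0, 1, 5, 6):
    print(f"RAID {level}: {raid_usable_capacity(level, 8, 4.0)} TB usable")
```

Moving down the list trades capacity for resilience: RAID 0 yields the full 32 TB but loses everything on a single drive failure, while RAID 6 gives up 8 TB to survive two simultaneous failures.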
In general, web services, standards, and more uniform architectures, all of which are characteristic of clouds, make integration easier. On the other hand, platform and software services in the cloud can make integration more difficult because they sometimes hide some of the interior workings of the service from the integrator.
Agile and DevOps are two methodologies that are strong influences today. Agile is a development methodology that emphasizes iterative incremental development over methodologies that advocate complete detailed designs before writing code. DevOps is a methodology that combines well with Agile. DevOps emphasizes close cooperation between development and operations groups. Operations participates in the design, construction, and testing of a project. When Agile and DevOps combine, code is delivered to operations teams in small increments. The experience of the operations team is then incorporated into the next drop of code.