Explore the best books in the Computers genre.

A study of the evolution of the modern computer profiles the work of MIT psychologist J. C. R. Licklider, whose visionary dream of a human-computer symbiosis transformed the course of modern science and led to the development of the personal computer.

Structure and Interpretation of Computer Programs has had a dramatic impact on computer science curricula over the past decade. This long-awaited revision contains changes throughout the text. There are new implementations of most of the major programming systems in the book, including the interpreters and compilers, and the authors have incorporated many small changes that reflect their experience teaching the course at MIT since the first edition was published. A new theme has been introduced that emphasizes the central role played by different approaches to dealing with time in computational models: objects with state, concurrent programming, functional programming and lazy evaluation, and nondeterministic programming. There are new example sections on higher-order procedures in graphics and on applications of stream processing in numerical programming, and many new exercises. In addition, all the programs have been reworked to run in any Scheme implementation that adheres to the IEEE standard.

The computer revolution brought with it new methods of getting work done—just look at today's news for reports of hard-driven, highly motivated young software and online commerce developers who sacrifice evenings and weekends to meet impossible deadlines. Tracy Kidder got a preview of this world in the late 1970s when he observed the engineers of Data General design and build a new 32-bit minicomputer in just one year. His thoughtful, prescient book, The Soul of a New Machine, tells stories of 35-year-old "veteran" engineers hiring recent college graduates and encouraging them to work harder and faster on complex and difficult projects, exploiting the youngsters' ignorance of normal scheduling processes while engendering a new kind of work ethic. These days, we are used to the "total commitment" philosophy of managing technical creation, but Kidder was surprised and even a little alarmed at the obsessions and compulsions he found. From in-house political struggles to workers being permitted to tease management to marathon 24-hour work sessions, The Soul of a New Machine explores concepts that already seem familiar, even old-hat, less than 20 years later. Kidder plainly admires his subjects; while he admits to hopeless confusion about their work, he finds their dedication heroic. The reader wonders, though, what will become of it all, now and in the future. —Rob Lightner

In as little as a decade, artificial intelligence could match, then surpass human intelligence. Corporations and government agencies around the world are pouring billions into achieving AI's Holy Grail—human-level intelligence. Once AI has attained it, scientists argue, it will have survival drives much like our own. We may be forced to compete with a rival more cunning, more powerful, and more alien than we can imagine. Through profiles of tech visionaries, industry watchdogs, and groundbreaking AI systems, James Barrat's Our Final Invention explores the perils of the heedless pursuit of advanced AI. Until now, human intelligence has had no rival. Can we coexist with beings whose intelligence dwarfs our own? Will they allow us to?

How a group of hackers, geniuses, and geeks created the digital revolution. The author of the masterful biography of Steve Jobs, published a few months after the Apple founder's death and since adapted for film, delivers a new work, monumental and unique. From the middle of the nineteenth century to the present day, Walter Isaacson traces the history of the first machines, from the birth of the computer to the explosion of the digital age. It is a remarkable saga of brilliant men and women, spectacular intuitions, and industrial adventures far out of the ordinary. From the vision of Ada Lovelace, Byron's daughter, who was the first to imagine that calculating machines could become multipurpose computers, to the dizzying new ideas of Google's creators, Isaacson offers a most stimulating gallery of portraits. The digital adventure is not simply the inventiveness of a few geniuses; it is a collective adventure made up of hundreds of steps, each tied to the creativity of a team. How did their minds work, what economic and social terrain made them so inventive, who were the pioneers of programming, of integrated circuits, of the Internet and the web, and what ideas inspired personalities as fascinating as John von Neumann, Alan Turing, Robert Noyce, Steve Jobs, and Larry Page? Drawing on colossal research, interviews with the major players still living, and a powerful, original reflection on this ongoing revolution, Walter Isaacson brings vividly to life the story of the builders of a new world.

Straight from the programming trenches, The Pragmatic Programmer cuts through the increasing specialization and technicalities of modern software development to examine the core process--taking a requirement and producing working, maintainable code that delights its users. It covers topics ranging from personal responsibility and career development to architectural techniques for keeping your code flexible and easy to adapt and reuse. Read this book, and you'll learn how to: fight software rot; avoid the trap of duplicating knowledge; write flexible, dynamic, and adaptable code; avoid programming by coincidence; bullet-proof your code with contracts, assertions, and exceptions; capture real requirements; test ruthlessly and effectively; delight your users; build teams of pragmatic programmers; and make your developments more precise with automation. Written as a series of self-contained sections and filled with entertaining anecdotes, thoughtful examples, and interesting analogies, The Pragmatic Programmer illustrates the best practices and major pitfalls of many different aspects of software development. Whether you're a new coder, an experienced programmer, or a manager responsible for software projects, use these lessons daily, and you'll quickly see improvements in personal productivity, accuracy, and job satisfaction. You'll learn skills and develop habits and attitudes that form the foundation for long-term success in your career. You'll become a Pragmatic Programmer.

The world's most infamous hacker offers an insider's view of the low-tech threats to high-tech security. Kevin Mitnick's exploits as a cyber-desperado and fugitive, the object of one of the most exhaustive FBI manhunts in history, have spawned dozens of articles, books, films, and documentaries. Since his release from federal prison in 2000, Mitnick has turned his life around and established himself as one of the most sought-after computer security experts worldwide. Now, in The Art of Deception, the world's most notorious hacker gives new meaning to the old adage, "It takes a thief to catch a thief." Focusing on the human factors involved with information security, Mitnick explains why all the firewalls and encryption protocols in the world will never be enough to stop a savvy grifter intent on rifling a corporate database or an irate employee determined to crash a system. With the help of many fascinating true stories of successful attacks on business and government, he illustrates just how susceptible even the most locked-down information systems are to a slick con artist impersonating an IRS agent. Narrating from the points of view of both the attacker and the victims, he explains why each attack was so successful and how it could have been prevented, in an engaging and highly readable style reminiscent of a true-crime novel. And, perhaps most importantly, Mitnick offers advice for preventing these types of social engineering hacks through security protocols, training programs, and manuals that address the human element of security.

DeMarco and Lister demonstrate that the major issues of software development are human, not technical. Their answers aren't easy--just incredibly successful. The new second edition features eight all-new chapters.

If you pick your books by their popularity--how many and which other people are reading them--then know this about The Search: it's probably on Bill Gates' reading list, and that of almost every venture capitalist and startup-hungry entrepreneur in Silicon Valley. In its sweeping survey of the history of Internet search technologies, its gossip about and analysis of Google, and its speculation on the larger cultural implications of a Web-connected world, it will likely receive attention from a variety of businesspeople, technology futurists, journalists, and interested observers of the mid-2000s zeitgeist. This ambitious book comes with a strong pedigree. Author John Battelle was one of the original editors of Wired and later a founder of The Industry Standard, two magazines which helped shape our early perceptions of the wild world of the Internet. Battelle clearly drew from his experience and contacts in writing The Search. In addition to the sure-handed historical perspective and easy familiarity with such dot-com stalwarts as AltaVista, Lycos, and Excite, he speckles his narrative with conversational asides from a cast of fascinating characters, such as Google's founders, Larry Page and Sergey Brin; Yahoo's Jerry Yang and David Filo; key executives at Microsoft and at different VC firms on the famed Sand Hill Road; and numerous other insiders, particularly at the company which currently sits atop the search world, Google. The Search is not exactly a corporate history of Google. At the book's outset, Battelle specifically indicates his desire to understand what he calls the cultural anthropology of search, and to analyze search engines' current role as the "database of our intentions"--the repository of humanity's curiosity, exploration, and expressed desires. Interesting as that beginning is, Battelle's story really picks up speed when he starts dishing inside scoop on the darling business story of the decade, Google. To Battelle's credit, though, he doesn't stop with historical retrospective: the final part of his book focuses on the potential future directions of Google and its products' development. In what Battelle himself acknowledges might just be a "digital fantasy train", he describes the possibility that Google will become the centralizing platform for our entire lives and quotes one early employee on the weightiness of Google's potential impact: "Sometimes I feel like I am on a bridge, twenty thousand feet up in the air. If I look down I'm afraid I'll fall. I don't feel like I can think about all the implications." Some will shrug at such words; after all, similar hype has accompanied other technologies and other companies before. Many others, though, will search Battelle's story for meaning--and fast. --Peter Han

This book is meant to help the reader learn how to program in C. It is the definitive reference guide, now in a second edition. Although the first edition was written in 1978, it continues to be a worldwide best-seller. This second edition brings the classic original up to date to include the ANSI standard. From the Preface: We have tried to retain the brevity of the first edition. C is not a big language, and it is not well served by a big book. We have improved the exposition of critical features, such as pointers, that are central to C programming. We have refined the original examples, and have added new examples in several chapters. For instance, the treatment of complicated declarations is augmented by programs that convert declarations into words and vice versa. As before, all examples have been tested directly from the text, which is in machine-readable form. As we said in the first preface to the first edition, C "wears well as one's experience with it grows." With a decade more experience, we still feel that way. We hope that this book will help you to learn C and use it well.
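
To make the preface's point about declarations concrete, here is a small, self-contained C sketch; it is not code from the book, just an illustration of the kind of "complicated declaration" that the declaration-to-words converter programs mentioned above are meant to untangle.

    /* Illustrative sketch only, not from the book: fp reads as "an array of
     * 2 pointers to functions taking int and returning double". */
    #include <stdio.h>

    static double half(int x)   { return x / 2.0; }
    static double square(int x) { return (double)x * x; }

    double (*fp[2])(int) = { half, square };

    int main(void) {
        for (int i = 0; i < 2; i++)
            printf("%.1f\n", fp[i](7));   /* prints 3.5, then 49.0 */
        return 0;
    }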

There are companies that create waves and those that ride or are drowned by them. This is a ride on the Google wave, and the fullest account of how it formed and crashed into traditional media businesses. With unprecedented access to Google's founders and executives, as well as to those in media who are struggling to keep their heads above water, Ken Auletta reveals how the industry is being disrupted and redefined. Auletta goes inside Google's closed-door meetings, introducing Google's notoriously private founders, Larry Page and Sergey Brin, as well as those who work with - and against - them. In Googled, the reader discovers the 'secret sauce' of the company's success and why the worlds of 'new' and 'old' media often communicate as if residents of different planets. It may send chills down traditionalists' spines, but it's a crucial roadmap to the future of the media business: the Google story may well be the canary in the coal mine. Googled is candid, objective and authoritative. Crucially, it's not just history or reportage: it's ahead of the curve and unlike other books about Google, which tend to be near-histories, somewhat starstruck, now out of date, or which fail to look at the full synthesis of business and technology.

The science behind global warming, and its history: how scientists learned to understand the atmosphere, to measure it, to trace its past, and to model its future. Global warming skeptics often fall back on the argument that the scientific case for global warming is all model predictions, nothing but simulation; they warn us that we need to wait for real data, “sound science.” In A Vast Machine Paul Edwards has news for these skeptics: without models, there are no data. Today, no collection of signals or observations—even from satellites, which can “see” the whole planet with a single instrument—becomes global in time and space without passing through a series of data models. Everything we know about the world's climate we know through models. Edwards offers an engaging and innovative history of how scientists learned to understand the atmosphere—to measure it, trace its past, and model its future.

“Ruby on Rails™ Tutorial by Michael Hartl has become a must-read for developers learning how to build Rails apps.” — Peter Cooper, Editor of Ruby Inside. Using Rails, developers can build web applications of exceptional elegance and power. Although its remarkable capabilities have made Ruby on Rails one of the world’s most popular web development frameworks, it can be challenging to learn and use. Ruby on Rails™ Tutorial, Second Edition, is the solution. Best-selling author and leading Rails developer Michael Hartl teaches Rails by guiding you through the development of your own complete sample application using the latest techniques in Rails web development. The updates to this edition include an all-new site design using Twitter’s Bootstrap; coverage of the new asset pipeline, including Sprockets and Sass; behavior-driven development (BDD) with Capybara and RSpec; better automated testing with Guard and Spork; roll-your-own authentication with has_secure_password; and an introduction to Gherkin and Cucumber. You’ll find integrated tutorials not only for Rails, but also for the essential Ruby, HTML, CSS, JavaScript, and SQL skills you’ll need when developing web applications. Hartl explains how each new technique solves a real-world problem, and he demonstrates this with bite-sized code that’s simple enough to understand, yet novel enough to be useful. Whatever your previous web development experience, this book will guide you to true Rails mastery.

Based on unprecedented access to the corporation’s archives, The Intel Trinity is the first full history of Intel Corporation—the essential company of the digital age—told through the lives of the three most important figures in the company’s history: Robert Noyce, Gordon Moore, and Andy Grove. Often hailed as the “most important company in the world,” Intel remains, more than four decades after its inception, a defining company of the global digital economy. The legendary inventor of the microprocessor, the single most important product in the modern world, Intel today builds the tiny “engines” that power almost every intelligent electronic device on the planet. But the true story of Intel is the human story of the trio of geniuses behind it. Michael S. Malone reveals how each brought different things to Intel, and at different times. Noyce, the most respected high-tech figure of his generation, brought credibility (and money) to the company’s founding; Moore made Intel the world’s technological leader; and Grove relentlessly drove the company to ever-higher levels of success and competitiveness. Without any one of these figures, Intel would never have achieved its historic success; with them, Intel made possible the personal computer, the Internet, telecommunications, and the personal electronics revolutions. The Intel Trinity is not just the story of Intel’s legendary past; it also offers an analysis of the formidable challenges that lie ahead as the company struggles to maintain its dominance, its culture, and its legacy. With eight pages of black-and-white photos.

A thought-provoking and wide-ranging exploration of machine learning and the race to build computer intelligences as flexible as our own. In the world's top research labs and universities, the race is on to invent the ultimate learning algorithm: one capable of discovering any knowledge from data, and doing anything we want, before we even ask. In The Master Algorithm, Pedro Domingos lifts the veil to give us a peek inside the learning machines that power Google, Amazon, and your smartphone. He assembles a blueprint for the future universal learner--the Master Algorithm--and discusses what it will mean for business, science, and society. If data-ism is today's philosophy, this book is its bible.

A Library Journal Best Book of the Year. Tech guru Brian McCullough delivers a rollicking history of the internet, why it exploded, and how it changed everything. The internet was never intended for you, opines Brian McCullough in this lively narrative of an era that utterly transformed everything we thought we knew about technology. In How the Internet Happened, he chronicles the whole fascinating story for the first time, beginning in a dusty Illinois basement in 1993, when a group of college kids set off a once-in-an-epoch revolution with what would become the first “dotcom.” Depicting the lives of now-famous innovators like Netscape’s Marc Andreessen and Facebook’s Mark Zuckerberg, McCullough also reveals surprising quirks and unknown tales as he tracks both the technology and the culture around the internet’s rise. Cinematic in detail and unprecedented in scope, the result both enlightens and informs as it draws back the curtain on the new rhythm of disruption and innovation the internet fostered, and helps to redefine an era that changed every part of our lives.

An inside look at modern open source software developers--and their applications to, and influence on, our online social world. "Nadia is one of today's most nuanced thinkers about the depth and potential of online communities, and this book could not have come at a better time." --Devon Zuegel, director of product, communities at GitHub. Open source software--in which developers publish code that anyone can use--has long served as a bellwether for other online behavior. In the late 1990s, it provided an optimistic model for public

Widely considered one of the best practical guides to programming, Steve McConnell’s original Code Complete has been helping developers write better software for more than a decade. Now this classic book has been fully updated and revised with leading-edge practices—and hundreds of new code samples—illustrating the art and science of software construction. Capturing the body of knowledge available from research, academia, and everyday commercial practice, McConnell synthesizes the most effective techniques and must-know principles into clear, pragmatic guidance. No matter what your experience level, development environment, or project size, this book will inform and stimulate your thinking—and help you build the highest quality code.

Before the Internet became widely known as a global tool for terrorists, one perceptive U.S. citizen recognized its ominous potential. Armed with clear evidence of computer espionage, he began a highly personal quest to expose a hidden network of spies that threatened national security. But would the authorities back him up? Cliff Stoll's dramatic firsthand account is "a computer-age detective story, instantly fascinating [and] astonishingly gripping" (Smithsonian). Cliff Stoll was an astronomer turned systems manager at Lawrence Berkeley Lab when a 75-cent accounting error alerted him to the presence of an unauthorized user on his system. The hacker's code name was "Hunter" -- a mysterious invader who managed to break into U.S. computer systems and steal sensitive military and security information. Stoll began a one-man hunt of his own, spying on the spy. It was a dangerous game of deception, broken codes, satellites, and missile bases -- a one-man sting operation that finally gained the attention of the CIA...and ultimately trapped an international spy ring fueled by cash, cocaine, and the KGB.

There was a time, not too long ago, when the typewriter and notebook ruled, and the computer as an everyday tool was simply a vision. Revolution in the Valley traces this vision back to its earliest roots: the hallways and backrooms of Apple, where the groundbreaking Macintosh computer was born. The book traces the development of the Macintosh, from its inception as an underground skunkworks project in 1979 to its triumphant introduction in 1984 and beyond. The stories in Revolution in the Valley come on extremely good authority. That's because author Andy Hertzfeld was a core member of the team that built the Macintosh system software, and a key creator of the Mac's radically new user interface software. As one of the chosen few who worked with the mercurial Steve Jobs, he might be called the ultimate insider. When Revolution in the Valley begins, Hertzfeld is working on Apple's first attempt at a low-cost, consumer-oriented computer: the Apple II. He sees that Steve Jobs is luring some of the company's most brilliant innovators to work on a tiny research effort: the Macintosh. Hertzfeld manages to make his way onto the Macintosh research team, and the rest is history. Through lavish illustrations, period photos, and Hertzfeld's vivid first-hand accounts, Revolution in the Valley reveals what it was like to be there at the birth of the personal computer revolution. The story comes to life through the book's portrait of the talented and often eccentric characters who made up the Macintosh team. Now, over 20 years later, millions of people are benefiting from the technical achievements of this determined and brilliant group of people.

The Washington Post called this book "impressive" and "meticulously researched," with "much of the drama and suspense of a novel." The New York Times and USA Today found it "definitive." The Seattle Times said Gates "should be required reading for any new hire in the personal computer industry." Since its publication, Gates has been cited and used as a source by dozens of books and articles. Bill Gates is an American icon, the ultimate revenge of the nerd. The youngest self-made billionaire in history was for many years the most powerful person in the computer industry. His tantrums, his odd rocking tic, and his lavish philanthropy have become the stuff of legend. Gates is the one book that truly illuminates the early years of the man and his company. In high school he organized computer enterprises for profit. At Harvard he co-wrote Microsoft BASIC, the first commercial personal computer software, then dropped out and made it a global standard. At 25, he offered IBM a program he did not yet own--a program called DOS that would become the essential operating system for more than 100 million personal computers and the foundation of the Gates empire. As Microsoft's dominance extended around the globe, Bill Gates became idolized, hated, and feared. In this riveting independent biography, veteran computer journalists Stephen Manes and Paul Andrews draw on a dozen sessions with Gates himself and nearly a thousand hours of interviews with his friends, family, employees, and competitors to debunk the myths and paint the definitive picture of the real Bill Gates, "bugs" and all. Here is the shy but fearless competitor with the guts and brass to try anything once--on a computer, at a negotiation, or on water skis. Here is the cocky 23-year-old who calmly spurned an enormous buyout offer from Ross Perot. Here is the supersalesman who motivated his Smart Guys, fought bitter battles with giant IBM, and locked horns with Apple's Steve Jobs--and usually won. Here, too, is the workaholic pessimist who presided over Microsoft's meteoric rise while most other personal computer pioneers fell by the wayside. Gates extended his vision of software to art, entertainment, education, and even biotechnology, and made good on much of his promise to put his software "on every desk and in every home." Gates is a bracing, comprehensive portrait of the microcomputer industry, one of its leading companies, and the man who helped create a world where software is everything.

Someone once said that the task of a writer is to "make the familiar new and the new familiar." For years, Joel Spolsky has done exactly this at www.joelonsoftware.com. Now, for the first time, you can own a collection of the most important essays from his site in one book, with exclusive commentary and new insights from Joel.

Most programmers' fear of user interface (UI) programming comes from their fear of doing UI design. They think that UI design is like graphic design―the mysterious process by which creative, latte-drinking, all-black-wearing people produce cool-looking, artistic pieces. Most programmers see themselves as analytic, logical thinkers instead―strong at reasoning, weak on artistic judgment, and incapable of doing UI design. In this brilliantly readable book, author Joel Spolsky proposes simple, logical rules that can be applied without any artistic talent to improve any user interface, from traditional GUI applications to websites to consumer electronics. Spolsky's primary axiom, the importance of bringing the program model in line with the user model, is both rational and simple. In a fun and entertaining way, Spolsky makes user interface design easy for programmers to grasp. After reading User Interface Design for Programmers, you'll know how to design interfaces with the user in mind. You'll learn the important principles that underlie all good UI design, and you'll learn how to perform usability testing that works.

Starting in the 1980s, Lisp began to be used in several large systems, including Emacs, AutoCAD, and Interleaf. On Lisp explains the reasons behind Lisp's growing popularity as a mainstream programming language. On Lisp is a comprehensive study of advanced Lisp techniques, with bottom-up programming as the unifying theme. It gives the first complete description of macros and macro applications. The book also covers important subjects related to bottom-up programming, including functional programming, rapid prototyping, interactive development, and embedded languages. The final chapter takes a deeper look at object-oriented programming than previous Lisp books, showing the step-by-step construction of a working model of the Common Lisp Object System (CLOS). As well as an indispensable reference, On Lisp is a source of software. Its examples form a library of functions and macros that readers will be able to use in their own Lisp programs.

"The first edition of Programming Pearls was one of the most influential books I read early in my career, and many of the insights I first encountered in that book stayed with me long after I read it. Jon has done a wonderful job of updating the material. I am very impressed at how fresh the new examples seem." - Steve McConnell, author, Code CompleteWhen programmers list their favorite books, Jon Bentley's collection of programming pearls is commonly included among the classics. Just as natural pearls grow from grains of sand that irritate oysters, programming pearls have grown from real problems that have irritated real programmers. With origins beyond solid engineering, in the realm of insight and creativity, Bentley's pearls offer unique and clever solutions to those nagging problems. Illustrated by programs designed as much for fun as for instruction, the book is filled with lucid and witty descriptions of practical programming techniques and fundamental design principles. It is not at all surprising that Programming Pearls has been so highly valued by programmers at every level of experience. In this revision, the first in 14 years, Bentley has substantially updated his essays to reflect current programming methods and environments. In addition, there are three new essays on (1) testing, debugging, and timing; (2) set representations; and (3) string problems. All the original programs have been rewritten, and an equal amount of new code has been generated. Implementations of all the programs, in C or C++, are now available on the Web.What remains the same in this new edition is Bentley's focus on the hard core of programming problems and his delivery of workable solutions to those problems. Whether you are new to Bentley's classic or are revisiting his work for some fresh insight, this book is sure to make your own list of favorites.

The practice of building software is a “new kid on the block” technology. Though it may not seem this way for those who have been in the field for most of their careers, in the overall scheme of professions, software builders are relative “newbies.” In the short history of the software field, a lot of facts have been identified, and a lot of fallacies promulgated. Those facts and fallacies are what this book is about. There’s a problem with those facts–and, as you might imagine, those fallacies. Many of these fundamentally important facts are learned by a software engineer, but over the short lifespan of the software field, all too many of them have been forgotten. While reading Facts and Fallacies of Software Engineering, you may experience moments of “Oh, yes, I had forgotten that,” alongside some “Is that really true?” thoughts. The author of this book doesn’t shy away from controversy. In fact, each of the facts and fallacies is accompanied by a discussion of whatever controversy envelops it. You may find yourself agreeing with a lot of the facts and fallacies, yet emotionally disturbed by a few of them! Whether you agree or disagree, you will learn why the author has been called “the premier curmudgeon of software practice.” These facts and fallacies are fundamental to the software building field–forget or neglect them at your peril!

Paradigms of AI Programming is the first text to teach advanced Common Lisp techniques in the context of building major AI systems. By reconstructing authentic, complex AI programs using state-of-the-art Common Lisp, the book teaches students and professionals how to build and debug robust practical programs, while demonstrating superior programming style and important AI concepts. The author strongly emphasizes the practical performance issues involved in writing real working programs of significant size. Chapters on troubleshooting and efficiency are included, along with a discussion of the fundamentals of object-oriented programming and a description of the main CLOS functions. This volume is an excellent text for a course on AI programming, a useful supplement for general AI courses and an indispensable reference for the professional programmer.

The first edition, published in 1973, has become a classic reference in the field. Now with the second edition, readers will find information on key new topics such as neural networks and statistical pattern recognition, the theory of machine learning, and the theory of invariances. Also included are worked examples, comparisons between different methods, extensive graphics, expanded exercises and computer project topics. An Instructor's Manual presenting detailed solutions to all the problems in the book is available from the Wiley editorial department.

A fundamental software engineering project management guide based on the practical requirements of "Taming Wild Software Schedules". Emphasizes possible, realistic and "best practice" approaches for managers, technical leads and self-managed teams. The author emphasizes efficient development concepts with an examination of rapid development strategies and a study of classic mistakes, within the context of software-development fundamentals and risk management. Dissects the core issues of rapid development, lifecycle planning, estimation and scheduling. Contains very good and practical discussions of customer-oriented development, motivation and teamwork. Explains such fundamental requirements as team structure, feature-set control (the dreaded feature creep in every project), availability and use of productivity tools and project recovery options. Relevant case studies are analyzed and discussed within the context of specific software development problems. Over 200 pages in this publication are devoted to a summary of best practices, everything from the daily build and smoke test, through prototyping, model selection, measurement, reuse, and the top-10 risks list. This publication is definitely recommended and will become a classic in the field, just as the author's prior publication, "Code Complete," already is.

"After a rare speech at the National Center for Atmospheric Research in Boulder, Colorado, in 1976, programmers in the audience had suddenly fallen silent when Cray offered to answer questions. He stood there for several minutes, waiting for their queries, but none came. When he left, the head of NCAR's computing division chided the programmers. 'Why didn't someone raise a hand?' After a tense moment, one programmer replied, 'How do you talk to God?'" --from The Supermen: The Story of Seymour Cray and the Technical Wizards behind the Supercomputer. "They were building revolutionary, not evolutionary, machines. . . . They were blazing a trail--molding science into a product. . . . The freedom to create was extraordinary." --from The Supermen. In 1951, a soft-spoken, skinny young man fresh from the University of Minnesota took a job in an old glider factory in St. Paul. Computer technology would never be the same, for the glider factory was the home of Engineering Research Associates and the recent college grad was Seymour R. Cray. During his extraordinary career, Cray would be alternately hailed as "the Albert Einstein," "the Thomas Edison," and "the Evel Knievel" of supercomputing. At various times, he was all three: a master craftsman, inventor, and visionary whose disdain for the rigors of corporate life became legendary, and whose achievements remain unsurpassed. The Supermen is award-winning writer Charles J. Murray's exhilarating account of how the brilliant (some would say eccentric) Cray and his gifted colleagues blazed the trail that led to the Information Age. This is a thrilling, real-life scientific adventure, deftly capturing the daring, seat-of-the-pants spirit of the early days of computer development, as well as an audacious, modern-day David and Goliath battle, in which a group of maverick engineers beat out IBM to become the runaway industry leaders. Murray's briskly paced narrative begins during the final months of the Second World War, when men such as William Norris and Howard Engstrom began researching commercial applications for the code-breaking machines of wartime, and charts the rise of technological research in response to the Cold War. In those days computers were huge, cumbersome machines with names like Demon and Atlas. When Cray came on board, things quickly changed. Drawing on in-depth interviews, including the last interview Cray completed before his untimely and tragic death, Murray provides rare insight into Cray's often controversial approach to his work. Cray could spend exhausting hours in single-minded pursuit of a particular goal, and Murray takes us behind the scenes to witness late-night brainstorming sessions and miraculous eleventh-hour fixes. Cray's casual, often hostile attitude toward management, although alienating to some, was more than a passionate need for independence; he simply thought differently than others. Seymour Cray saw farther and faster, and trusted his vision with an unassailable confidence. Yet he inspired great loyalty as well, making it possible for his own start-up company, Cray Research, to bring the 54,000-employee conglomerate of Control Data to its knees. Ultimately, The Supermen is a story of genius, and how a unique set of circumstances (a small-team approach, corporate detachment, and a government-backed marketplace) enabled that genius to flourish.
In an atmosphere of unparalleled freedom and creativity, Seymour Cray's vision and drive fueled a technological revolution from which America would emerge as the world's leader in supercomputing.

Modern Operating Systems, Fourth Edition, is intended for introductory courses in Operating Systems in Computer Science, Computer Engineering, and Electrical Engineering programs. The widely anticipated revision of this worldwide best-seller incorporates the latest developments in operating systems (OS) technologies. The Fourth Edition includes up-to-date material on current operating systems. Tanenbaum also provides information on current research based on his experience as an operating systems researcher. Modern Operating Systems, Third Edition was the recipient of the 2010 McGuffey Longevity Award, which recognizes textbooks whose excellence has been demonstrated over time (http://taaonline.net/index.html). Teaching and Learning Experience: This program will provide a better teaching and learning experience–for you and your students. It will help: * Provide practical detail on the big-picture concepts: A clear and entertaining writing style outlines the concepts every OS designer needs to master. * Keep your course current: This edition includes information on the latest OS technologies and developments. * Enhance learning with student and instructor resources: Students will gain hands-on experience using the simulation exercises and lab experiments.

Originally published in 1993, this is the story of Steve Jobs’s ambitious attempts after he left Apple in 1985 to create a new company, NeXT Computer. This period was the nadir of Jobs’s professional life, as NeXT’s products failed to find a welcome in the marketplace. The company burned through more than $250 million without managing to eke out a profit. It would eventually be rescued by Apple and Jobs would return there after the close of the book’s narrative. When he did, he took with him lessons learned during his NeXT years in how not to manage a company.

The CLOS metaobject protocol is an elegant, high-performance extension to the Common Lisp Object System. The authors, who developed the metaobject protocol and who were among the group that developed CLOS, introduce this new approach to programming language design, describe its evolution and design principles, and present a formal specification of a metaobject protocol for CLOS. Kiczales, des Rivières, and Bobrow show that the "art of metaobject protocol design" lies in creating a synthetic combination of object-oriented and reflective techniques that can be applied under existing software engineering considerations to yield a new approach to programming language design that meets a broad set of design criteria. One of the major benefits of including the metaobject protocol in programming languages is that it allows users to adjust the language to better suit their needs. Metaobject protocols also disprove the adage that adding more flexibility to a programming language reduces its performance. In presenting the principles of metaobject protocols, the authors work with actual code for a simplified implementation of CLOS and its metaobject protocol, providing an opportunity for the reader to gain hands-on experience with the design process. They also include a number of exercises that address important concerns and open issues. Gregor Kiczales and Jim des Rivières are Members of the Research Staff, and Daniel Bobrow is a Research Fellow, in the System Sciences Laboratory at Xerox Palo Alto Research Center.

Most people are baffled by how computers work and assume that they will never understand them. What they don't realize—and what Daniel Hillis's short book brilliantly demonstrates—is that computers' seemingly complex operations can be broken down into a few simple parts that perform the same simple procedures over and over again. Computer wizard Hillis offers an easy-to-follow explanation of how data is processed that makes the operations of a computer seem as straightforward as those of a bicycle. Avoiding technobabble or discussions of advanced hardware, the lucid explanations and colorful anecdotes in The Pattern on the Stone go straight to the heart of what computers really do. Hillis proceeds from an outline of basic logic to clear descriptions of programming languages, algorithms, and memory. He then takes readers in simple steps up to the most exciting developments in computing today—quantum computing, parallel computing, neural networks, and self-organizing systems. Written clearly and succinctly by one of the world's leading computer scientists, The Pattern on the Stone is an indispensable guide to understanding the workings of that most ubiquitous and important of machines: the computer.
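
To make the book's central claim tangible (a hedged C sketch, not an example taken from the book): even something as familiar as addition can be assembled from a few simple logical parts repeated over and over, here AND, XOR, and a shift.

    /* Illustrative sketch only: integer addition built from simple logic. */
    #include <stdio.h>

    unsigned add_from_gates(unsigned a, unsigned b) {
        while (b != 0) {
            unsigned sum   = a ^ b;         /* XOR gives each bit's sum, ignoring carries */
            unsigned carry = (a & b) << 1;  /* AND finds the carries, shifted one place left */
            a = sum;
            b = carry;
        }
        return a;
    }

    int main(void) {
        printf("%u\n", add_from_gates(19, 23));  /* prints 42 */
        return 0;
    }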

When Pierre Omidyar launched a clunky website from a spare bedroom over Labor Day weekend of 1995, he wanted to see if he could use the Internet to create a perfect market. He never guessed his old-computer parts and Beanie Baby exchange would revolutionize the world of commerce. Now, Adam Cohen, the only journalist ever to get full access to the company, tells the remarkable story of eBay's rise. He describes how eBay built the most passionate community ever to form in cyberspace and forged a business that triumphed over larger, better-funded rivals. And he explores the ever-widening array of enlistees in the eBay revolution, from a stay-at-home mom who had to rent a warehouse for her thriving business selling bubble-wrap on eBay to the young MBA who started eBay Motors (which within months of its launch was on track to sell $1 billion in cars a year), to collectors nervously bidding thousands of dollars on antique clothing-irons. Adam Cohen's fascinating look inside eBay is essential reading for anyone trying to figure out what's next. If you want to truly understand the Internet economy, The Perfect Store is indispensable.

Man has within a single generation found himself sharing the world with a strange new species: the computers and computer-like machines. Neither history, nor philosophy, nor common sense will tell us how these machines will affect us, for they do not do "work" as did machines of the Industrial Revolution. Instead of dealing with materials or energy, we are told that they handle "control" and "information" and even "intellectual processes." There are very few individuals today who doubt that the computer and its relatives are developing rapidly in capability and complexity, and that these machines are destined to play important (though not as yet fully understood) roles in society's future. Though only some of us deal directly with computers, all of us are falling under the shadow of their ever-growing sphere of influence, and thus we all need to understand their capabilities and their limitations. It would indeed be reassuring to have a book that categorically and systematically described what all these machines can do and what they cannot do, giving sound theoretical or practical grounds for each judgment. However, although some books have purported to do this, it cannot be done, for the following reasons: a) Computer-like devices are utterly unlike anything which science has ever considered---we still lack the tools necessary to fully analyze, synthesize, or even think about them; and b) The methods discovered so far are effective in certain areas, but are developing much too rapidly to allow a useful interpretation and interpolation of results. The abstract theory---as described in this book---tells us in no uncertain terms that the machines' potential range is enormous, and that its theoretical limitations are of the subtlest and most elusive sort. There is no reason to suppose machines have any limitations not shared by man.

Kurt Gödel's Incompleteness Theorems sent shivers through Vienna's intellectual circles and directly challenged Ludwig Wittgenstein's dominant philosophy. Alan Turing's mathematical genius helped him break the Nazi Enigma Code during WWII. Though they never met, their lives strangely mirrored one another: both were brilliant, and both met with tragic ends. Here, a mysterious narrator intertwines these parallel lives into a double helix of genius and anguish, wonderfully capturing not only two radiant, fragile minds but also the zeitgeist of the era.

The magnificent, unrivaled history of codes and ciphers—how they're made, how they're broken, and the many and fascinating roles they've played since the dawn of civilization in war, business, diplomacy, and espionage—updated with a new chapter on computer cryptography and the Ultra secret. Man has created codes to keep secrets and has broken codes to learn those secrets since the time of the Pharaohs. For 4,000 years, fierce battles have been waged between codemakers and codebreakers, and the story of these battles is civilization's secret history, the hidden account of how wars were won and lost, diplomatic intrigues foiled, business secrets stolen, governments ruined, computers hacked. From the XYZ Affair to the Dreyfus Affair, from the Gallic War to the Persian Gulf, from Druidic runes and the kabbalah to outer space, from the Zimmermann telegram to Enigma to the Manhattan Project, codebreaking has shaped the course of human events to an extent beyond any easy reckoning. Once a government monopoly, cryptology today touches everybody. It secures the Internet, keeps e-mail private, maintains the integrity of cash machine transactions, and scrambles TV signals on unpaid-for channels. David Kahn's The Codebreakers takes the measure of what codes and codebreaking have meant in human history in a single comprehensive account, astonishing in its scope and enthralling in its execution. Hailed upon first publication as a book likely to become the definitive work of its kind, The Codebreakers has more than lived up to that promise: it remains unsurpassed. With a brilliant new chapter that makes use of previously classified documents to bring the book thoroughly up to date, and to explore the myriad ways computer codes and their hackers are changing all of our lives, The Codebreakers is the skeleton key to a thousand thrilling true stories of intrigue, mystery, and adventure. It is a masterpiece of the historian's art.
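
As a toy illustration of the codemaker-versus-codebreaker contest described above (a hedged C sketch, not an example from the book), here is one of the oldest tricks in that long history, the Caesar shift, together with the brute-force attack that defeats it:

    /* Illustrative sketch only: encipher with a Caesar shift, then break it
     * by simply trying every possible key. */
    #include <stdio.h>

    static void caesar(const char *in, char *out, int key) {
        size_t i;
        for (i = 0; in[i] != '\0'; i++)
            out[i] = (in[i] >= 'A' && in[i] <= 'Z')
                       ? (char)('A' + (in[i] - 'A' + key) % 26)
                       : in[i];
        out[i] = '\0';
    }

    int main(void) {
        char cipher[64], guess[64];
        caesar("ATTACK AT DAWN", cipher, 3);     /* codemaker: secret key is 3 */
        printf("ciphertext: %s\n", cipher);
        for (int key = 1; key < 26; key++) {     /* codebreaker: exhaust all keys */
            caesar(cipher, guess, 26 - key);
            printf("key %2d: %s\n", key, guess); /* key 3 reveals the plaintext */
        }
        return 0;
    }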

This is "the Word" -- one man's word, certainly -- about the art (and artifice) of the state of our computer-centric existence. And considering that the "one man" is Neal Stephenson, "the hacker Hemingway" (Newsweek) -- acclaimed novelist, pragmatist, seer, nerd-friendly philosopher, and nationally bestselling author of groundbreaking literary works (Snow Crash, Cryptonomicon, etc., etc.) -- the word is well worth hearing. Mostly well-reasoned examination and partial rant, Stephenson's In the Beginning... was the Command Line is a thoughtful, irreverent, hilarious treatise on the cyber-culture past and present; on operating system tyrannies and downloaded popular revolutions; on the Internet, Disney World, Big Bangs, not to mention the meaning of life itself.

Barely fifty years ago a computer was a gargantuan, vastly expensive thing that only a handful of scientists had ever seen. The world’s brightest engineers were stymied in their quest to make these machines small and affordable until the solution finally came from two ingenious young Americans. Jack Kilby and Robert Noyce hit upon the stunning discovery that would make possible the silicon microchip, a work that would ultimately earn Kilby the Nobel Prize for physics in 2000. In this completely revised and updated edition of The Chip, T.R. Reid tells the gripping adventure story of their invention and of its growth into a global information industry. This is the story of how the digital age began.

The first complete look at one of America's legendary business leaders. This groundbreaking biography by Kevin Maney, acclaimed technology columnist for USA Today, offers fresh insight and new information on one of the twentieth century's greatest business figures. Over the course of forty-two years, Thomas J. Watson took a failing business called the Computing-Tabulating-Recording Company and transformed it into IBM, the world's first and most famous high-tech company. The Maverick and His Machine is the first modern biography of this business titan. Maney secured exclusive access to hundreds of boxes of Watson's long-forgotten papers, and he has produced the only complete picture of Watson the man and Watson the legendary business leader. These uncovered documents reveal new information about how Watson bet the company in the 1920s on tabulating machines, the forerunners to computers, and how he daringly beat the Great Depression of the 1930s. The documents also lead to new insights concerning the controversy that has followed his supposed collusion with Adolf Hitler's Nazi regime. Maney paints a vivid portrait of Watson, uncovers his motivations, and offers needed context on his mammoth role in the course of modern business history. Jim Collins, author of the bestsellers Good to Great and Built to Last, writes in the Foreword to Maney's book: "Leaders like Watson are like forces of nature, almost terrifying in their release of energy and unpredictable volatility, but underneath they still adhere to certain patterns and principles. The patterns and principles might be hard to see amidst the melee, but they are there nonetheless. It takes a gifted person of insight to highlight those patterns, and that is exactly what Kevin Maney does in this book." The Maverick and His Machine also includes never-before-published photos of Watson from IBM's archives, showing Watson in greater detail than any book ever has before. Essential reading for every businessperson, tech junkie, and IBM follower, the book is also full of the kind of personal detail and reconstructed events that make it a page-turning story for general readers. The Maverick and His Machine is poised to be one of the most important business biographies in years. Kevin Maney is a nationally syndicated, award-winning technology columnist at USA Today, where he has been since 1985. He is a cover story writer whose story about IBM's bet-the-company move gained him national recognition. He was voted best technology columnist by the business journalism publication TJFR, and Marketing Computers magazine has four times named him one of the most influential technology columnists. He is the author of Wiley's Megamedia: The Inside Story of the Leaders and the Losers in the Exploding Communications Industry, which was a Business Week bestseller. "Watson was clearly a genius with a thousand helpers, yet he managed to build an institution that could transcend the genius." --from the Foreword by Jim Collins. "Like all great biographers, Kevin Maney gives us an engaging story . . . his fascinating and definitive book about IBM's founder is replete with amazing revelations and character lessons that resonate today." --Rosabeth Moss Kanter, Harvard Business School, bestselling author of Evolve! and When Giants Learn to Dance

Computers have completely changed the way we teach children. We have Mindstorms to thank for that. In this book, pioneering computer scientist Seymour Papert uses the invention of LOGO, the first child-friendly programming language, to make the case for the value of teaching children with computers. Papert argues that children are more than capable of mastering computers, and that teaching computational processes like debugging in the classroom can change the way we learn everything else. He also shows that schools saturated with technology can actually improve socialization and interaction among students and between students and teachers.

Nelson writes passionately about the need for people to understand computers deeply, more deeply than was generally promoted as computer literacy, which he considers a superficial kind of familiarity with particular hardware and software. His rallying cry "Down with Cybercrud" is against the centralization of computers such as that performed by IBM at the time, as well as against what he sees as the intentional untruths that "computer people" tell to non-computer people to keep them from understanding computers. In Dream Machines, Nelson covers the flexible media potential of the computer, which was shockingly new at the time.

“This makes entertaining reading. Many accounts of the birth of personal computing have been written, but this is the first close look at the drug habits of the earliest pioneers.” —New York Times. Most histories of the personal computer industry focus on technology or business. John Markoff’s landmark book is about the culture and consciousness behind the first PCs—the culture being counter– and the consciousness expanded, sometimes chemically. It’s a brilliant evocation of Stanford, California, in the 1960s and ’70s, where a group of visionaries set out to turn computers into a means for freeing minds and information. In these pages one encounters Ken Kesey and the phone hacker Cap’n Crunch, est and LSD, The Whole Earth Catalog and the Homebrew Computer Club. What the Dormouse Said is a poignant, funny, and inspiring book by one of the smartest technology writers around.

Discover why millions of computer users trust Dan Gookin to demystify DOS and communicate the essentials of computing. His clear explanations and down-to-earth style make exploring this operating system painless -- even fun! With complete coverage of all versions of DOS, as well as DOS under Windows 98, DOS For Dummies, 3rd Edition (the latest edition of this best-selling guide) offers frustrated DOS users more help than ever before. Inside, find helpful advice on how to: grasp the facts and features of MS-DOS in Windows 98; share data between DOS programs and Windows -- easily; find that lost file and retrieve your program if it crashes; handle error messages in DOS -- without panicking; work with all versions of DOS; discover more about modems and DOS utilities; understand complex terminology with clear explanations in a glossary of terms; and run programs directly from the DOS prompt.

In Go To, Steve Lohr chronicles the history of software from the early days of complex mathematical codes mastered by a few thousand to today's era of user-friendly software and over six million professional programmers worldwide. Lohr maps out the unique seductions of programming, and gives us an intimate portrait of the peculiar kind of genius that is drawn to this blend of art, science, and engineering, introducing us to the movers and shakers of the 1950s and the open-source movement of today. With original reporting and deft storytelling, Steve Lohr shows us how software transformed the world, and what it holds in store for our future.

Still the best book on the Internet. The Whole Internet User's Guide & Catalog, 2nd Edition is a comprehensive introduction to the international network of computer systems called the Internet, a resource of almost unimaginable wealth. As a complete introduction to the Internet, this book covers the basic utilities you use to access it: mail, telnet, ftp, and news readers. But it also does much more. The Guide pays close attention to several important information servers (Archie, WAIS, Gopher) that are, essentially, databases of what is available on the Net; they help you find what you want among the millions of files and thousands of archives available. There's also coverage of the World Wide Web. We've also included our own resource index, covering a broad selection of several hundred important resources, ranging from the King James Bible to archives for USENET news. So if you use the Internet for work or for pleasure -- or if you'd like to, but don't know how -- you need this book. If you've been around the Net for a few years, you'll still be able to discover resources you didn't know existed. Also includes a pull-out quick-reference card.

Today, Microsoft commands the high ground of the information superhighway by owning the operating systems and basic applications programs that run on the world's 170 million computers. Beyond the unquestioned genius and vision of Bill Gates, what accounts for Microsoft's astounding success? Drawing on almost two years of on-site observation at Microsoft headquarters, eminent scientists Michael A. Cusumano and Richard W. Selby reveal many of Microsoft's innermost secrets. This inside report, based on forty in-depth interviews by authors who had access to confidential documents and project data, outlines the seven complementary strategies that characterize exactly how Microsoft competes and operates, including the "Brain Trust" of talented employees and exceptional management; "bang for the buck" competitive strategies and clear organizational goals that produce self-critiquing, learning, and improving; a flexible, incremental approach to product development; and a relentless pursuit of future markets. Cusumano and Selby's masterful analysis successfully uncovers the distinctive way in which Microsoft has combined all of the elements necessary to get to the top of an enormously important industry -- and stay there.

The definitive guide to the cloud computing revolution. Hailed as "the most influential book so far on the cloud computing movement" (Christian Science Monitor), The Big Switch makes a simple and profound statement: Computing is turning into a utility, and the effects of this transition will ultimately change society as completely as the advent of cheap electricity did. In a new chapter for this edition that brings the story up-to-date, Nicholas Carr revisits the dramatic new world being conjured from the circuits of the "World Wide Computer."

Kaplan, a well-known figure in the computer industry, founded GO Corporation in 1987, and for several years it was one of the hottest new ventures in the Valley. Startup tells the story of Kaplan's wild ride: how he assembled a brilliant but fractious team of engineers, software designers, and investors; pioneered the emerging market for hand-held computers operated with a pen instead of a keyboard; and careened from crisis to crisis without ever losing his passion for a revolutionary idea. Along the way, Kaplan vividly recreates his encounters with eccentric employees, risk-addicted venture capitalists, and industry giants such as Bill Gates, John Sculley, and Mitchell Kapor. And no one - including Kaplan himself - is spared his sharp wit and observant eye.

Describes how to use the LaTeX document-preparation system to create documents, covering such topics as inputting text, symbols, and mathematics; including graphics; using LaTeX with HTML and XML; and producing PDF output.
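
Purely as an illustration of the kind of document the book walks you through (this minimal source file is my own sketch, not an example taken from the book), a LaTeX file mixing text and mathematics looks like this:

    % minimal.tex -- a hypothetical minimal document (illustrative only)
    \documentclass{article}
    \usepackage{graphicx}   % provides \includegraphics for figures
    \usepackage{amsmath}    % richer mathematics environments
    \begin{document}
    Text, symbols, and mathematics mix freely; inline math such as
    $e^{i\pi} + 1 = 0$ sits in the paragraph, while displayed equations use
    \begin{equation}
      \int_0^\infty e^{-x^2}\,dx = \frac{\sqrt{\pi}}{2}.
    \end{equation}
    % A graphic would be pulled in like this (the file name is a placeholder):
    % \includegraphics[width=0.5\textwidth]{diagram.pdf}
    \end{document}

Running a standard engine such as pdflatex over the file yields the PDF output the description mentions.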

Twenty-five years ago, it didn't exist. Today, twenty million people worldwide are surfing the Net. Where Wizards Stay Up Late is the exciting story of the pioneers responsible for creating the most talked about, most influential, and most far-reaching communications breakthrough since the invention of the telephone. In the 1960s, when computers were regarded as mere giant calculators, J.C.R. Licklider at MIT saw them as the ultimate communications devices. With Defense Department funds, he and a band of visionary computer whizzes began work on a nationwide, interlocking network of computers. Taking readers behind the scenes, Where Wizards Stay Up Late captures the hard work, genius, and happy accidents of their daring, stunningly successful venture.

Examines the development of the Macintosh computer, explains how the Apple Computer company is managed, and discusses the computer software industry.

Ray Kurzweil is the inventor of the most innovative and compelling technology of our era, an international authority on artificial intelligence, and one of our greatest living visionaries. Now he offers a framework for envisioning the twenty-first century--an age in which the marriage of human sensitivity and artificial intelligence fundamentally alters and improves the way we live. Kurzweil's prophetic blueprint for the future takes us through the advances that inexorably result in computers exceeding the memory capacity and computational ability of the human brain by the year 2020 (with human-level capabilities not far behind); in relationships with automated personalities who will be our teachers, companions, and lovers; and in information fed straight into our brains along direct neural pathways. Optimistic and challenging, thought-provoking and engaging, The Age of Spiritual Machines is the ultimate guide on our road into the next century.

When, in 1984–86, Richard P. Feynman gave his famous course on computation at the California Institute of Technology, he asked Tony Hey to adapt his lecture notes into a book. Although led by Feynman, the course also featured, as occasional guest speakers, some of the most brilliant men in science at that time, including Marvin Minsky, Charles Bennett, and John Hopfield. Although the lectures are now thirteen years old, most of the material is timeless and presents a “Feynmanesque” overview of many standard and some not-so-standard topics in computer science such as reversible logic gates and quantum computers.

The LISP language is designed primarily for symbolic data processing. It has been used for symbolic calculations in differential and integral calculus, electrical circuit theory, mathematical logic, game playing, and other fields of artificial intelligence. The manual describes LISP, a formal mathematical language. LISP differs from most programming languages in three important ways. The first is in the nature of the data. In the LISP language, all data are in the form of symbolic expressions, usually referred to as S-expressions, which are of indefinite length and have a branching, tree-type structure, so that significant subexpressions can be readily isolated. In the LISP system, the bulk of the available memory is used for storing S-expressions in the form of list structures. The second distinction is that the LISP language is the source language itself, which specifies in what way the S-expressions are to be processed. Third, LISP can interpret and execute programs written in the form of S-expressions. Thus, like machine language, and unlike most other high-level languages, it can be used to generate programs for further execution.
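
The manual's own examples are, of course, written in LISP itself; purely as an illustration of the ideas above (not code from the manual), here is a short Python sketch in which S-expressions are modeled as nested lists and those same lists are then interpreted as programs:

    # Illustrative sketch only: S-expressions as nested Python lists,
    # plus a tiny evaluator that treats the same lists as programs.

    def evaluate(expr, env):
        """Evaluate a minimal LISP-like expression."""
        if isinstance(expr, str):          # a symbol: look it up
            return env[expr]
        if not isinstance(expr, list):     # a number or other atom
            return expr
        op, *args = expr
        if op == "quote":                  # quoted data stays data
            return args[0]
        if op == "car":                    # first element of a list
            return evaluate(args[0], env)[0]
        if op == "cdr":                    # remainder of a list
            return evaluate(args[0], env)[1:]
        if op == "cons":                   # build a new list
            return [evaluate(args[0], env)] + evaluate(args[1], env)
        raise ValueError(f"unknown operator: {op}")

    # The same branching, tree-like structure serves as both program and data:
    program = ["cons", ["quote", "a"], ["quote", ["b", "c"]]]
    print(evaluate(program, {}))           # -> ['a', 'b', 'c']

The point of the sketch is the third property described above: because programs are just S-expressions, a program can be built as data and then handed to the interpreter for execution.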

Take the guesswork out of using regular expressions. With more than 140 practical recipes, this cookbook provides everything you need to solve a wide range of real-world problems. Novices will learn basic skills and tools, and programmers and experienced users will find a wealth of detail. Each recipe provides samples you can use right away. This revised edition covers the regular expression flavors used by C#, Java, JavaScript, Perl, PHP, Python, Ruby, and VB.NET. You’ll learn powerful new tricks, avoid flavor-specific gotchas, and save valuable time with this huge library of practical solutions.
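
To give a feel for what such a recipe looks like in practice (this particular example is mine, not one of the book's recipes), here is a small Python snippet that extracts ISO-8601 dates using named capture groups:

    # Illustrative regex "recipe": match YYYY-MM-DD dates with named groups,
    # so callers can pull out the parts by name.
    import re

    iso_date = re.compile(
        r"\b(?P<year>\d{4})-(?P<month>0[1-9]|1[0-2])-(?P<day>0[1-9]|[12]\d|3[01])\b"
    )

    text = "DOOM shipped on 1993-12-10; Quake followed on 1996-06-22."
    for match in iso_date.finditer(text):
        print(match.group("year"), match.group("month"), match.group("day"))
    # Prints: 1993 12 10, then 1996 06 22

The named-group syntax shown here is one of the details that varies between regex flavors, which is exactly the kind of gotcha the cookbook's per-flavor notes are meant to catch.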

The true story of Max Butler, the master hacker who ran a billion-dollar cybercrime network. The word spread through the hacking underground like some unstoppable new virus: an audacious crook had staged a hostile takeover of an online criminal network that siphoned billions of dollars from the US economy. The culprit was a brilliant programmer with a hippie ethic and a supervillain's double identity. Max 'Vision' Butler was a white-hat hacker and a celebrity throughout the programming world, even serving as a consultant to the FBI. But there was another side to Max. As the black-hat 'Iceman', he'd seen the fraudsters around him squabble, their ranks riddled with infiltrators, their methods inefficient, and in their dysfunction he saw the ultimate challenge: he would stage a coup and steal their ill-gotten gains from right under their noses. Through the story of Max Butler's remarkable rise, KINGPIN lays bare the workings of a silent crime wave affecting millions worldwide. It exposes vast online-fraud supermarkets stocked with credit card numbers, counterfeit cheques, hacked bank accounts and fake passports. Thanks to Kevin Poulsen's remarkable access to both cops and criminals, we step inside the quiet, desperate battle that law enforcement fights against these scammers. And learn that the boy next door may not be all he seems.

If there were a hall of fame or shame for computer hackers, a Kevin Mitnick plaque would be mounted near the entrance. While other nerds were fumbling with password possibilities, this adept break-artist was penetrating the digital secrets of Sun Microsystems, Digital Equipment Corporation, Nokia, Motorola, Pacific Bell, and other mammoth enterprises. His Ghost in the Wires memoir paints an action portrait of a plucky loner motivated by a passion for trickery, not material gain. (P.S. Mitnick's capers have already been the subject of two books and a movie. This first-person account is the most comprehensive to date.)

“It is possible to invent a single machine which can be used to compute any computable sequence,” twenty-four-year-old Alan Turing announced in 1936. In Turing’s Cathedral, George Dyson focuses on a small group of men and women, led by John von Neumann at the Institute for Advanced Study in Princeton, New Jersey, who built one of the first computers to realize Alan Turing’s vision of a Universal Machine. Their work would break the distinction between numbers that mean things and numbers that do things—and our universe would never be the same. Using five kilobytes of memory (the amount allocated to displaying the cursor on a computer desktop of today), they achieved unprecedented success in both weather prediction and nuclear weapons design, while tackling, in their spare time, problems ranging from the evolution of viruses to the evolution of stars. Dyson’s account, both historic and prophetic, sheds important new light on how the digital universe exploded in the aftermath of World War II. The proliferation of both codes and machines was paralleled by two historic developments: the decoding of self-replicating sequences in biology and the invention of the hydrogen bomb. It’s no coincidence that the most destructive and the most constructive of human inventions appeared at exactly the same time. How did code take over the world? In retracing how Alan Turing’s one-dimensional model became John von Neumann’s two-dimensional implementation, Turing’s Cathedral offers a series of provocative suggestions as to where the digital universe, now fully three-dimensional, may be heading next.

Written by noted quantum computing theorist Scott Aaronson, this book takes readers on a tour through some of the deepest ideas of maths, computer science and physics. Full of insights, arguments and philosophical perspectives, the book covers an amazing array of topics. Beginning in antiquity with Democritus, it progresses through logic and set theory, computability and complexity theory, quantum computing, cryptography, the information content of quantum states and the interpretation of quantum mechanics. There are also extended discussions about time travel, Newcomb's Paradox, the anthropic principle and the views of Roger Penrose. Aaronson's informal style makes this fascinating book accessible to readers with scientific backgrounds, as well as students and researchers working in physics, computer science, mathematics and philosophy.

Your no-nonsense guide to making sense of machine learning. Machine learning can be a mind-boggling concept for the masses, but those who are in the trenches of computer programming know just how invaluable it is. Without machine learning, fraud detection, web search results, real-time ads on web pages, credit scoring, automation, and email spam filtering wouldn't be possible, and that's just a few of its capabilities. Written by two data science experts, Machine Learning For Dummies offers a much-needed entry point for anyone looking to use machine learning to accomplish practical tasks. Covering the entry-level topics needed to get you familiar with the basic concepts of machine learning, this guide quickly helps you make sense of the programming languages and tools you need to turn machine learning-based tasks into a reality. Whether you're maddened by the math behind machine learning, apprehensive about AI, perplexed by preprocessing data―or anything in between―this guide makes it easier to understand and implement machine learning seamlessly. Dive into this complete beginner's guide so you are armed with all you need to know about machine learning!

It was early 1993 and id Software was at the top of the PC gaming industry. Wolfenstein 3D had established the First Person Shooter genre and sales of its sequel Spear of Destiny were skyrocketing. The technology and tools id had taken years to develop gave them a lead their many competitors could not match. It would have been easy for id to coast on their success, but instead they made the audacious decision to throw away everything they had built and start from scratch. Game Engine Black Book: DOOM is the story of how they did it. This is a book about history and engineering. Don't expect much prose (the author's English has improved since the first book but is still broken). Instead you will find extensive descriptions and drawings to better understand all the challenges id Software had to overcome. From the hardware -- the Intel 486 CPU, the Motorola 68040 CPU, and the NeXT workstations -- to the game engine's revolutionary design, open it up to learn how DOOM changed the gaming industry and became a legend among video games.

A sweeping examination of the current state of artificial intelligence and how it is remaking our world. No recent scientific enterprise has proved as alluring, terrifying, and filled with extravagant promise and frustrating setbacks as artificial intelligence. The award-winning author Melanie Mitchell, a leading computer scientist, now reveals AI’s turbulent history and the recent spate of apparent successes, grand hopes, and emerging fears surrounding it. In Artificial Intelligence, Mitchell turns to the most urgent questions concerning AI today: How intelligent—really—are the best AI programs? How do they work? What can they actually do, and when do they fail? How humanlike do we expect them to become, and how soon do we need to worry about them surpassing us? Along the way, she introduces the dominant models of modern AI and machine learning, describing cutting-edge AI programs, their human inventors, and the historical lines of thought underpinning recent achievements. She meets with fellow experts such as Douglas Hofstadter, the cognitive scientist and Pulitzer Prize–winning author of the modern classic Gödel, Escher, Bach, who explains why he is “terrified” about the future of AI. She explores the profound disconnect between the hype and the actual achievements in AI, providing a clear sense of what the field has accomplished and how much further it has to go. Interweaving stories about the science of AI and the people behind it, Artificial Intelligence brims with clear-sighted, captivating, and accessible accounts of the most interesting and provocative modern work in the field, flavored with Mitchell’s humor and personal observations. This frank, lively book is an indispensable guide to understanding today’s AI, its quest for “human-level” intelligence, and its impact on the future for us all.

Two leaders in the field offer a compelling analysis of the current state of the art and reveal the steps we must take to achieve a truly robust AI. Despite the hype surrounding AI, creating an intelligence that rivals or exceeds human levels is far more complicated than we are led to believe. Professors Gary Marcus and Ernest Davis have spent their careers at the forefront of AI research and have witnessed some of the greatest milestones in the field, but they argue that a computer winning at games like Jeopardy! and Go does not signal that we are on the doorstep of fully autonomous cars or superintelligent machines. The achievements in the field thus far have occurred in closed systems with fixed sets of rules. These approaches are too narrow to achieve genuine intelligence. The world we live in is wildly complex and open-ended. How can we bridge this gap? What will the consequences be when we do? Marcus and Davis show us what we need to first accomplish before we get there and argue that if we are wise along the way, we won't need to worry about a future of machine overlords. If we heed their advice, humanity can create an AI that we can trust in our homes, our cars, and our doctor's offices. Rebooting AI provides a lucid, clear-eyed assessment of the current science and offers an inspiring vision of what we can achieve and how AI can make our lives better.

The only contemporary history of the birth of Silicon Valley, from the reporter who had a ringside seat to it all. Over the past five decades, the tech industry has grown into one of the most important sectors of the global economy. Silicon Valley―replete with sprawling office parks, sky-high rents, and countless self-made millionaires―is home to many of its key players. But the origins of Silicon Valley and the tech sector are much humbler. At a time when tech companies’ influence continues to grow, The Big Score chronicles how they began. One of the first reporters on the tech industry beat at the San Jose Mercury-News, Michael S. Malone recounts the feverish efforts of young technologists and entrepreneurs to build something that would change the world―and score them a big payday. Starting with the birth of Hewlett-Packard in the 1930s, Malone illustrates how decades of technological innovation laid the foundation for the meteoric rise of the Valley in the 1970s. Drawing on exclusive, unvarnished interviews, Malone punctuates this history with incisive profiles of tech’s early luminaries―including Nobelist William Shockley and Apple’s Steve Jobs―when they were struggling entrepreneurs working 18-hour days in their garages. And he plunges us into the darker side of the Valley, where espionage, drugs, hellish working conditions, and shocking betrayals shaped the paths for winners and losers in a booming industry. A decades-long story with individual sacrifice, ingenuity, and big money at its core, The Big Score recounts the history of today’s most dynamic sector through its upstart beginnings.