<p>For the past decade or more, much of cell biology research has been focused on determining the key molecules involved in different cellular processes, an analytical problem that has been amenable to biochemical and genetic approaches. Now, we face an integrative problem of understanding how all of these molecules work together to produce living cells, a challenge that requires using quantitative approaches to model the complex interactions within a cell, and testing those models with careful quantitative measurements.</p> <p>This book is an introductory overview of the various approaches, methods, techniques, and models employed in quantitative cell biology, which are reviewed in greater detail in the other volumes in this e-book series. Particular emphasis is placed on the goals and purpose of quantitative analysis and modeling, and the special challenges that cell biology holds for understanding life at the physical level.</p>
<p>Intelligent systems often depend on data provided by information agents, for example, sensor data or crowdsourced human computation. Providing accurate and relevant data requires costly effort that agents may not always be willing to provide. Thus, it becomes important not only to verify the correctness of data, but also to provide incentives so that agents that provide high-quality data are rewarded while those that do not are discouraged by low rewards.</p> <p>We cover different settings and the assumptions they admit, including sensing, human computation, peer grading, reviews, and predictions. We survey different incentive mechanisms, including proper scoring rules, prediction markets and peer prediction, Bayesian Truth Serum, Peer Truth Serum, Correlated Agreement, and the settings where each of them would be suitable. As an alternative, we also consider reputation mechanisms. We complement the game-theoretic analysis with practical examples of applications in prediction platforms, community sensing, and peer grading.</p>
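<p>As a concrete illustration of the incentive idea, the following minimal sketch implements one classic mechanism from the proper scoring rule family, the quadratic (Brier) rule; the weather example and all numbers are ours, not taken from the book.</p>
<pre>
import numpy as np

# Quadratic (Brier) scoring rule: S(r, o) = 2*r[o] - sum_i r[i]^2.
# It is strictly proper, so an agent maximizes expected reward only by
# reporting its true belief.
def quadratic_score(report, outcome):
    return 2.0 * report[outcome] - float(np.dot(report, report))

# An agent who believes rain has probability 0.7 does best by reporting 0.7,
# not by exaggerating to certainty.
belief = np.array([0.7, 0.3])        # true belief: P(rain), P(no rain)
honest = np.array([0.7, 0.3])
exaggerated = np.array([1.0, 0.0])

expected = lambda r: sum(belief[o] * quadratic_score(r, o) for o in range(2))
print(expected(honest), expected(exaggerated))   # 0.58 vs. 0.40: honesty wins in expectation
</pre>
<p>Peer prediction, Bayesian Truth Serum, and the other mechanisms surveyed in the book address the harder setting where no ground-truth outcome is ever observed against which such a score could be computed.</p>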
<p>This book provides computer engineers, academic researchers, new graduate students, and seasoned practitioners with an end-to-end overview of virtual memory. We begin with a recap of foundational concepts and discuss not only the state-of-the-art virtual memory hardware and software support available today, but also emerging research trends in this space. The span of topics covers processor microarchitecture, memory systems, operating system design, and memory allocation. We show how efficient virtual memory implementations hinge on careful hardware and software cooperation, and we discuss new research directions aimed at addressing these emerging problems.</p> <p>Virtual memory is a classic computer science abstraction and one of the pillars of the computing revolution. It has long enabled hardware flexibility, software portability, and better overall security, to name just a few of its powerful benefits. Nearly all user-level programs today take for granted that they have been freed from the burden of physical memory management by the hardware, the operating system, device drivers, and system libraries.</p> <p>However, despite its ubiquity in systems ranging from warehouse-scale datacenters to embedded Internet of Things (IoT) devices, the overheads of virtual memory are becoming a critical performance bottleneck today. Virtual memory architectures designed for individual CPUs or even individual cores are in many cases struggling to scale up and scale out to today's systems, which increasingly include exotic hardware accelerators (such as GPUs, FPGAs, or DSPs) and emerging memory technologies (such as non-volatile memory), and which run increasingly intensive workloads (such as virtualized and/or "big data" applications). As such, many of the fundamental abstractions and implementation approaches for virtual memory are being augmented, extended, or entirely rebuilt in order to ensure that virtual memory remains viable and performant in the years to come.</p>
<p>This book provides fundamental principles, design procedures, and design tools for unmanned aerial vehicles (UAVs), with three sections focusing on vehicle design, autopilot design, and ground system design. The design of manned aircraft and the design of UAVs have some similarities and some differences, spanning the design process, constraints (e.g., g-load, pressurization), and the main UAV components (autopilot, ground station, communication, sensors, and payload). A UAV designer must be aware of the latest UAV developments and current technologies, know the lessons learned from past failures, and appreciate the breadth of UAV design options.</p> <p>The contribution of unmanned aircraft continues to expand every day, and over 20 countries are developing and employing UAVs for both military and scientific purposes. A UAV system is much more than a reusable air vehicle or vehicles. UAVs are air vehicles: they fly like airplanes and operate in an airplane environment. They are designed like air vehicles and have to meet flight-critical air vehicle requirements. A designer needs to know how to integrate complex, multi-disciplinary systems and to understand the environment, the requirements, and the design challenges; this book is an excellent overview of the fundamentals from an engineering perspective.</p> <p>This book is meant to meet the needs of newcomers to the world of UAVs. The materials are intended to provide enough information in each area and to illustrate how all the areas play together to support the design of a complete UAV. Therefore, this book can be used both as a reference for engineers entering the field and as a supplementary text for a UAV design course, providing system-level context for each specialized topic.</p>
Both of the authors of these volumes are on a mission. In this book they present many issues that relate to this context and ask the reader to question whether it is possible to define a 'good', 'right', or 'just' engineer. In order to do this, the authors need theoretical underpinnings as well as practical realities. They look historically at what engineering has been and consider what it could be, what it should be, and what its function is. They consider the notion of 'engineering' in a society and what that implies. This volume allows readers to walk through some of the relevant concepts from disciplines that seem very far from engineering, but which are critical to draw from if we are to prepare engineers to contribute to an increasingly just society. The authors draw on different aspects of philosophy, psychology, economics, development studies, politics, history, and sociology in their aim to understand our social context. They freely admit their positions and ask readers to critique them as well as the other authors discussed. They do not ask readers to accept their way of looking at the world, but to question it, to question others, and to question their own views, and then to make up their own minds and act on their beliefs.
Game theory is the mathematical study of interaction among independent, self-interested agents. The audience for game theory has grown dramatically in recent years, and now spans disciplines as diverse as political science, biology, psychology, economics, linguistics, sociology, and computer science, among others. What has been missing is a relatively short introduction to the field covering the common basis that anyone with a professional interest in game theory is likely to require. Such a text would minimize notation, ruthlessly focus on essentials, and yet not sacrifice rigor. This Synthesis Lecture aims to fill this gap by providing a concise and accessible introduction to the field. It covers the main classes of games, their representations, and the main concepts used to analyze them. Table of Contents: Games in Normal Form / Analyzing Games: From Optimality to Equilibrium / Further Solution Concepts for Normal-Form Games / Games with Sequential Actions: The Perfect-information Extensive Form / Generalizing the Extensive Form: Imperfect-Information Games / Repeated and Stochastic Games / Uncertainty about Payoffs: Bayesian Games / Coalitional Game Theory / History and References / Index
Geometric Programming is used for cost minimization, profit maximization, obtaining cost ratios, and the development of generalized design equations for the primal variables. The early pioneers of geometric programming (Zener, Duffin, Peterson, Beightler, Wilde, and Phillips) played important roles in its development. Five new case studies have been added to the third edition. There are five major sections: (1) Introduction, History and Theoretical Fundamentals; (2) Cost Minimization Applications with Zero Degrees of Difficulty; (3) Profit Maximization Applications with Zero Degrees of Difficulty; (4) Applications with Positive Degrees of Difficulty; and (5) Summary, Future Directions, and Geometric Programming Theses & Dissertations Titles. The various solution techniques presented are the constrained derivative approach, condensation of terms approach, dimensional analysis approach, and transformed dual approach. A primary goal of this work is to have readers develop more case studies and new solution techniques to further the application of geometric programming.
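As a flavor of the zero-degree-of-difficulty case, the sketch below solves a small unconstrained posynomial via the classical dual approach: the normality and orthogonality conditions determine the dual weights uniquely, and the minimum cost then follows in closed form. The example problem and its numbers are purely illustrative and are not taken from the book's case studies.

import numpy as np

# Zero-degree-of-difficulty geometric program (illustrative):
#   minimize g(x, y) = 2/(x*y) + 3*x + 4*y,   x, y > 0
# Three terms, two variables => degrees of difficulty = 3 - (2 + 1) = 0.
c = np.array([2.0, 3.0, 4.0])          # term coefficients
a = np.array([[-1.0, -1.0],            # exponents of term 1: x^-1 * y^-1
              [ 1.0,  0.0],            # exponents of term 2: x^1
              [ 0.0,  1.0]])           # exponents of term 3: y^1

# Dual conditions: normality (weights sum to 1) and orthogonality (a^T * delta = 0).
A = np.vstack([np.ones(len(c)), a.T])  # 3 linear equations in 3 unknown weights
b = np.array([1.0, 0.0, 0.0])
delta = np.linalg.solve(A, b)          # unique solution: all weights equal 1/3

# With zero degrees of difficulty the dual gives the minimum cost in closed form.
g_min = np.prod((c / delta) ** delta)

# Recover the primal variables: at the optimum each term contributes delta_t * g_min.
x = delta[1] * g_min / 3.0
y = delta[2] * g_min / 4.0
print(delta, g_min, x, y)              # g_min = 648**(1/3), roughly 8.65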
In his book In the Blink of an Eye, Walter Murch, the Oscar-winning editor of The English Patient, Apocalypse Now, and many other outstanding movies, devises the Rule of Six -- six criteria for what makes a good cut. At the top of his list is "to be true to the emotion of the moment," a quality more important than advancing the story or being rhythmically interesting. The cut has to deliver a meaningful, compelling, and emotion-rich "experience" to the audience. After all, "what they finally remember is not the editing, not the camerawork, not the performances, not even the story -- it's how they felt." Technology for All the Right Reasons applies this insight to the design of interactive products and technologies -- the domain of Human-Computer Interaction, Usability Engineering, and Interaction Design. It takes an experiential approach, putting experience before functionality and leaving behind oversimplified calls for ease, efficiency, and automation or shallow beautification. Instead, it explores what really matters to humans and what it takes to make technology more meaningful. The book clarifies what experience is, and highlights five crucial aspects and their implications for the design of interactive products. It provides reasons why we should bother with an experiential approach, and presents a detailed working model of experience useful for practitioners and academics alike. It closes with the particular challenges an experiential approach poses for design. The book presents its view as a comprehensive, yet entertaining blend of scientific findings, design examples, and personal anecdotes. Table of Contents: Follow me / Crucial Properties of Experience / Three Good Reasons to Consider Experience / A Model of Experience / Reflections on Experience Design
In these notes, we introduce particle filtering as a recursive importance sampling method that approximates the minimum-mean-square-error (MMSE) estimate of a sequence of hidden state vectors in scenarios where the joint probability distribution of the states and the observations is non-Gaussian and, therefore, closed-form analytical expressions for the MMSE estimate are generally unavailable. We begin the notes with a review of Bayesian approaches to static (i.e., time-invariant) parameter estimation. In the sequel, we describe the solution to the problem of sequential state estimation in linear, Gaussian dynamic models, which corresponds to the well-known Kalman (or Kalman-Bucy) filter. We then move to the general nonlinear, non-Gaussian stochastic filtering problem and present particle filtering as a sequential Monte Carlo approach to solve that problem in a statistically optimal way. We review several techniques to improve the performance of particle filters, including importance function optimization, particle resampling, Markov chain Monte Carlo move steps, auxiliary particle filtering, and regularized particle filtering. We also discuss Rao-Blackwellized particle filtering as a technique that is particularly well suited for many relevant applications such as fault detection and inertial navigation. Finally, we conclude the notes with a discussion of the emerging topic of distributed particle filtering using multiple processors located at remote nodes in a sensor network. Throughout the notes, we often assume a more general framework than in most introductory textbooks by allowing either the observation model or the hidden state dynamic model to include unknown parameters. In a fully Bayesian fashion, we treat those unknown parameters also as random variables. Using suitable dynamic conjugate priors, that approach can then be applied to perform joint state and parameter estimation.
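As a minimal illustration of the sequential Monte Carlo machinery described above, the sketch below implements a bootstrap particle filter (importance sampling from the transition prior followed by multinomial resampling) on a toy scalar model with a nonlinear observation equation, for which the Kalman filter does not apply and no closed-form MMSE estimate exists. The model and all parameters are invented for illustration and are not from the notes.

import numpy as np

rng = np.random.default_rng(0)

# Toy model (illustrative only):
#   state:       x_t = 0.9 * x_{t-1} + w_t,   w_t ~ N(0, 1)
#   observation: y_t = x_t^2 / 5 + v_t,       v_t ~ N(0, 1)   (nonlinear)

def simulate(T=50):
    x = np.zeros(T)
    for t in range(1, T):
        x[t] = 0.9 * x[t - 1] + rng.normal()
    y = x ** 2 / 5.0 + rng.normal(size=T)
    return x, y

def bootstrap_pf(y, n_particles=1000):
    particles = rng.normal(size=n_particles)      # draw from the prior p(x_0)
    estimates = np.zeros(len(y))
    for t, y_t in enumerate(y):
        # Propagate through the state dynamics; using the transition prior as the
        # importance function is what makes this the "bootstrap" filter.
        particles = 0.9 * particles + rng.normal(size=n_particles)
        # Weight each particle by the observation likelihood p(y_t | x_t).
        log_w = -0.5 * (y_t - particles ** 2 / 5.0) ** 2
        w = np.exp(log_w - log_w.max())
        w /= w.sum()
        # The MMSE estimate is approximated by the weighted particle mean.
        estimates[t] = np.sum(w * particles)
        # Multinomial resampling to combat weight degeneracy.
        particles = particles[rng.choice(n_particles, size=n_particles, p=w)]
    return estimates

x_true, y_obs = simulate()
x_hat = bootstrap_pf(y_obs)
print(np.sqrt(np.mean((x_hat - x_true) ** 2)))    # RMSE of the filtered estimates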
Of Clocks and Time takes readers on a five-stop journey through the physics and technology (and occasional bits of applications and history) of timekeeping. On the way, conceptual vistas and qualitative images abound, but since mathematics is spoken everywhere the book visits, equations, quantitative relations, and rigorous definitions are offered as well. The expedition begins with a discussion of the rhythms produced by the daily and annual motions of the sun, moon, planets, and stars. Centuries' worth of observation and thinking culminate in Newton's penetrating theoretical insights; his notions of space and time are still influential today. During the following two legs of the trip, tools are examined that allow us to measure hours and minutes and then, with ever-growing precision, the tiniest fractions of a second. When the pace of travel approaches the ultimate speed limit, the speed of light, time and space exhibit strange and counter-intuitive traits. On this fourth stage of the journey, Einstein is the local tour guide whose special and general theories of relativity explain the behavior of clocks under these circumstances. Finally, the last part of the voyage reverses direction, moving ever deeper into the past to explore how we can tell the age of "things," including that of the universe itself.
Sentiment analysis and opinion mining is the field of study that analyzes people's opinions, sentiments, evaluations, attitudes, and emotions from written language. It is one of the most active research areas in natural language processing and is also widely studied in data mining, Web mining, and text mining. In fact, this research has spread outside of computer science to the management sciences and social sciences due to its importance to business and society as a whole. The growing importance of sentiment analysis coincides with the growth of social media such as reviews, forum discussions, blogs, micro-blogs, Twitter, and social networks. For the first time in human history, we now have a huge volume of opinionated data recorded in digital form for analysis. Sentiment analysis systems are being applied in almost every business and social domain because opinions are central to almost all human activities and are key influencers of our behaviors. Our beliefs and perceptions of reality, and the choices we make, are largely conditioned on how others see and evaluate the world. For this reason, when we need to make a decision we often seek out the opinions of others. This is true not only for individuals but also for organizations. This book is a comprehensive introductory and survey text. It covers all important topics and the latest developments in the field with over 400 references. It is suitable for students, researchers and practitioners who are interested in social media analysis in general and sentiment analysis in particular. Lecturers can readily use it in class for courses on natural language processing, social media analysis, text mining, and data mining. Lecture slides are also available online.
The goal of this book is to present an overview of the current state of the art in computer architecture performance evaluation. The book covers various aspects that relate to performance evaluation, ranging from performance metrics, to workload selection, to various modeling approaches such as analytical modeling and simulation. And because simulation is by far the most prevalent modeling technique in computer architecture evaluation, the book spends more than half its content on simulation, covering an overview of the various simulation techniques in the computer designer's toolbox, followed by various simulation acceleration techniques such as sampled simulation, statistical simulation, and parallel and hardware-accelerated simulation. The evaluation methods described in this book have a primary focus on performance. Although performance remains a key design target, it is no longer the sole design target. Power consumption and reliability have quickly become primary design concerns, and today they are probably as important as performance. Other important design constraints relate to cost, thermal issues, yield, etc. This book focuses on performance evaluation methods only. This does not compromise the importance and general applicability of the techniques described in this book, because power and reliability models are typically integrated into existing performance models; these integrated models pose challenges similar to the ones handled in this book. The book also focuses on presenting fundamental concepts and ideas. The book does not provide much quantitative data. Although quantitative data is crucial to performance evaluation in practice, it is not needed to understand the fundamentals of performance evaluation methods. Moreover, quantitative data from different sources may be hard to compare, and may even be misleading, because the contexts in which the results were obtained may be very different; a comparison based on these numbers could therefore be meaningless.
The last few years have witnessed rapid development of dictionary learning approaches for a set of visual computing tasks, largely due to their utilization in developing new techniques based on sparse representation. Compared with conventional techniques employing manually defined dictionaries, such as the Fourier transform and wavelet transform, dictionary learning aims at obtaining a dictionary adaptively from the data so as to support optimal sparse representation of the data. In contrast to conventional clustering algorithms like K-means, where a data point is associated with only one cluster center, in a dictionary-based representation, a data point can be associated with a small set of dictionary atoms. Thus, dictionary learning provides a more flexible representation of data and may have the potential to capture more relevant features from the original feature space of the data. One of the early algorithms for dictionary learning is K-SVD. In recent years, many variations/extensions of K-SVD and other new algorithms have been proposed, with some aiming at adding discriminative capability to the dictionary, and some attempting to model the relationship of multiple dictionaries. One prominent application of dictionary learning is in the general field of visual computing, where long-standing challenges have seen promising new solutions based on sparse representation with learned dictionaries. With a timely review of recent advances in dictionary learning in visual computing, covering the most recent literature with an emphasis on papers after 2008, this book provides a systematic presentation of the general methodologies, specific algorithms, and examples of applications for those who wish to have a quick start on this subject.
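For readers who want the quick start in code form, the sketch below learns an overcomplete dictionary on toy data and codes each sample with at most a few atoms, using scikit-learn's DictionaryLearning rather than the K-SVD algorithm discussed in the book; the data and parameter choices are purely illustrative.

import numpy as np
from sklearn.decomposition import DictionaryLearning

rng = np.random.default_rng(0)

# Toy data: 200 samples in a 20-dimensional feature space.
X = rng.normal(size=(200, 20))

learner = DictionaryLearning(
    n_components=30,                  # overcomplete: more atoms than feature dimensions
    transform_algorithm="omp",        # sparse coding via orthogonal matching pursuit
    transform_n_nonzero_coefs=3,      # each sample is represented by at most 3 atoms
    max_iter=50,
    random_state=0,
)
codes = learner.fit_transform(X)      # sparse coefficients, shape (200, 30)
D = learner.components_               # learned dictionary atoms, shape (30, 20)

# Unlike K-means, each sample is a combination of a small set of atoms.
print((codes != 0).sum(axis=1).max())                       # at most 3 nonzeros per sample
print(np.linalg.norm(X - codes @ D) / np.linalg.norm(X))    # relative reconstruction error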
The new field of cryptographic currencies and consensus ledgers, commonly referred to as <i>blockchains</i>, is receiving increasing interest from various different communities. These communities are very diverse and include, among others, technical enthusiasts, activist groups, researchers from various disciplines, start-ups, large enterprises, public authorities, banks, financial regulators, businessmen, investors, and also criminals. The scientific community adapted relatively slowly to this emerging and fast-moving field of cryptographic currencies and consensus ledgers. This was one reason that, for quite a while, the only resources available were the Bitcoin source code, blog and forum posts, mailing lists, and other online publications. The original Bitcoin paper, which initiated the hype, was also published online without any prior peer review. Following the original publication spirit of the Bitcoin paper, a lot of innovation in this field has repeatedly come from the community itself in the form of online publications and online conversations instead of established peer-reviewed scientific publishing. On the one hand, this spirit of fast free software development, combined with the business aspects of cryptographic currencies, as well as the interests of today's time-to-market focused industry, produced a flood of publications, whitepapers, and prototypes. On the other hand, this has led to deficits in systematization and a gap between practice and the theoretical understanding of this new field. This book aims to further close this gap and presents a well-structured overview of this broad field from a technical viewpoint. The archetype for modern cryptographic currencies and consensus ledgers is Bitcoin and its underlying Nakamoto consensus. Therefore, we describe the inner workings of this protocol in great detail and discuss its relations to other derived systems.
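<p>As a toy illustration of the hash-chaining and proof-of-work ideas at the heart of Nakamoto consensus, the sketch below mines blocks that each commit to their predecessor's hash. It is a deliberately simplified stand-in, not Bitcoin's actual format: it omits transactions, Merkle trees, difficulty adjustment, and the peer-to-peer protocol, and uses a single SHA-256 over a JSON encoding.</p>
<pre>
import hashlib
import json
import time

def block_hash(block):
    # Hash a canonical JSON encoding of the block contents.
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def mine(prev_hash, data, difficulty=4):
    # Proof of work: find a nonce so the block hash starts with `difficulty` hex zeros.
    nonce = 0
    while True:
        block = {"prev": prev_hash, "data": data, "time": int(time.time()), "nonce": nonce}
        h = block_hash(block)
        if h.startswith("0" * difficulty):
            block["hash"] = h
            return block
        nonce += 1

# Each block commits to the hash of its predecessor, so tampering with an old
# block invalidates every block after it unless all their proofs are redone.
genesis = mine("0" * 64, "genesis")
second = mine(genesis["hash"], "hello, ledger")
print(second["hash"], second["nonce"])
</pre>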
This book deals with "crypto-biometrics," a relatively new and multi-disciplinary area of research (started in 1998). Combining biometrics and cryptography provides multiple advantages, such as revocability, template diversity, better verification accuracy, and generation of cryptographically usable keys that are strongly linked to the user identity. In this text, a thorough review of the subject is provided, and some of the main categories are then illustrated with systems recently proposed by the authors. Beginning with the basics, this text deals with various aspects of crypto-biometrics, including review, cancelable biometrics, cryptographic key generation from biometrics, and crypto-biometric key sharing protocols. Because of the thorough treatment of the topic, this text will be highly beneficial to researchers and industry professionals in information security and privacy. Table of Contents: Introduction / Cancelable Biometric System / Cryptographic Key Regeneration Using Biometrics / Biometrics-Based Secure Authentication Protocols / Concluding Remarks
This book is about the role of some engineering principles in our everyday lives. Engineers study these principles and use them in the design and analysis of the products and systems with which they work. The same principles play basic and influential roles in our everyday lives as well. Whether it is the concept of entropy, the moment of inertia, the natural frequency, the Coriolis acceleration, or the electromotive force, the roles and effects of these phenomena are the same in a system designed by an engineer as in one created by nature. Learning about these engineering concepts therefore helps us to understand why certain things happen or behave the way they do. These concepts are not strange phenomena invented by individuals only for their own use; rather, they are part of our everyday physical and natural world, put to our benefit by engineers and scientists. Learning about these principles might also help attract more qualified and interested high school and college students to the engineering fields. Each chapter of this book explains one of these principles through examples, discussions, and, at times, simple equations.
This book offers a comprehensive overview of the various concepts and research issues surrounding blogs or weblogs. It introduces techniques and approaches, tools and applications, and evaluation methodologies with examples and case studies. Blogs allow people to express their thoughts, voice their opinions, and share their experiences and ideas. Blogs also facilitate interactions among individuals, creating a network with unique characteristics. Through these interactions, individuals experience a sense of community. We elaborate on approaches that extract communities and cluster blogs based on information about the bloggers. Open standards and the low barrier to publication in the Blogosphere have transformed information consumers into producers, generating an overwhelming amount of ever-increasing knowledge about the members, their environment, and their symbiosis. We elaborate on approaches that sift through massive blog data sources to identify influential and trustworthy bloggers, leveraging content and network information. Spam blogs, or "splogs," are an increasing concern in the Blogosphere and are discussed in detail along with approaches that leverage supervised machine learning algorithms and interaction patterns. We elaborate on data collection procedures, provide resources for blog data repositories, mention various visualization and analysis tools in the Blogosphere, and explain conventional and novel evaluation methodologies to help perform research in the Blogosphere. The book is supported by additional material, including lecture slides as well as the complete set of figures used in the book, and the reader is encouraged to visit the book website for the latest information: http://tinyurl.com/mcp-agarwal Table of Contents: Modeling Blogosphere / Blog Clustering and Community Discovery / Influence and Trust / Spam Filtering in Blogosphere / Data Collection and Evaluation
This book provides a brief overview of the popular Finite Element Method (FEM) and its hybrid versions for electromagnetics, with applications to radar scattering, antennas and arrays, guided structures, microwave components, frequency selective surfaces, periodic media, RF material characterization, and related topics. It starts by presenting concepts based on Hilbert and Sobolev spaces as well as curl and divergence spaces for generating matrices, useful in all engineering simulation methods. It then proceeds to present applications of the finite element and finite element-boundary integral methods for scattering and radiation. Applications to periodic media, metamaterials, and bandgap structures are also included. The hybrid volume integral equation method for high-contrast dielectrics is presented for the first time. Another unique feature of the book is the inclusion of design optimization techniques and their integration within commercial numerical analysis packages for shape and material design. To aid the reader with the method's utility, an entire chapter is devoted to two-dimensional problems. The book can be considered an update on the latest developments since the publication of our earlier book (Finite Element Method for Electromagnetics, IEEE Press, 1998). The latter is certainly a complementary companion to this one.