Algorithms and Data Structures Books
Cambridge University Press How to Think about Algorithms
Book Synopsis: The second edition of this student-friendly textbook now includes over 150 new exercises, key concept summaries, and a chapter on machine learning algorithms. Its approachability and clarity make it ideal both as a main course text and as a supplementary book for students who find other books challenging.
£28.49
APress Modern Deep Learning for Tabular Data
Book Synopsis: Deep learning is one of the most powerful tools in the modern artificial intelligence landscape. Although it has predominantly been applied to highly specialized image, text, and signal datasets, this book synthesizes and presents novel deep learning approaches to a seemingly unlikely domain: tabular data. Whether for finance, business, security, medicine, or countless other domains, deep learning can help mine and model complex patterns in tabular data, an incredibly ubiquitous form of structured data. Part I of the book offers a rigorous overview of machine learning principles, algorithms, and implementation skills relevant to holistically modeling and manipulating tabular data. Part II studies five dominant deep learning model designs (Artificial Neural Networks, Convolutional Neural Networks, Recurrent Neural Networks, Attention and Transformers, and Tree-Rooted Networks) through both their "default" usage and their application to tabular data. Part III compounds the pow…
Table of Contents:
Section 1: Machine Learning and Tabular Data
- Chapter 1 – Introduction to Machine Learning
- Chapter 2 – Data Tools
Section 2: Applied Deep Learning Architectures
- Chapter 3 – Artificial Neural Networks
- Chapter 4 – Convolutional Neural Networks
- Chapter 5 – Recurrent Neural Networks
- Chapter 6 – Attention Mechanism
- Chapter 7 – Tree-based Neural Networks
Section 3: Deep Learning Design and Tools
- Chapter 8 – Autoencoders
- Chapter 9 – Data Generation
- Chapter 10 – Meta-optimization
- Chapter 11 – Multi-model arrangement
- Chapter 12 – Deep Learning Interpretability
Appendix A
£41.24
APress Practical Business Analytics Using R and Python
Book Synopsis: This book illustrates how data can be useful in solving business problems. It explores various analytics techniques for using data to discover hidden patterns and relationships, predict future outcomes, optimize efficiency, and improve the performance of organizations. You'll learn how to analyze data by applying concepts of statistics, probability theory, and linear algebra. In this new edition, both R and Python are used to demonstrate these analyses. Practical Business Analytics Using R and Python also features new chapters covering databases, SQL, neural networks, text analytics, and natural language processing. Part one begins with an introduction to analytics, the foundations required to perform data analytics, and explains different analytics terms and concepts such as databases and SQL, basic statistics, probability theory, and data exploration. Part two introduces predictive models using statistical machine learning and discusses concepts like regression, classifi…
Table of Contents:
Section I: Introduction to Analytics. In this section, we discuss the necessary foundations required to perform data analytics: different analytics terms, basic statistics and probability theory, descriptive statistics including various plots, and various measures for evaluating your predictive models.
- Chapter 1: Business Analytics Revolution
- Chapter 2: Foundations of Business Analytics
- Chapter 3: Structured Query Language (SQL) Analytics
- Chapter 4: Business Analytics Process
- Chapter 5: Exploratory Data Analysis (EDA)
- Chapter 6: Evaluating Analytics Model Performance
Section II: Supervised Learning and Predictive Analytics. In this section, we introduce statistical learning models and machine learning models. We present various regression and classification analyses, discuss logistic regression, and end by introducing neural networks and gradient descent algorithms.
- Chapter 7: Simple Linear Regressions
- Chapter 8: Multiple Linear Regressions
- Chapter 9: Classification
- Chapter 10: Neural Networks
- Chapter 11: Logistic Regression
Section III: Time Series Models. In this section, we introduce optimization models and time series analysis. In time series, we discuss different forecasting models, and in optimization models, we introduce both linear and non-linear optimization models.
- Chapter 12: Time Series – Forecasting
Section IV: Unsupervised Models and Text Mining. In this section, we discuss two popular unsupervised models, cluster analysis and relationship data mining, and end by introducing text mining and NLP and briefly introducing big data.
- Chapter 13: Cluster Analysis
- Chapter 14: Relationship Data Mining
- Chapter 15: Mining Text and Text Analytics
- Chapter 16: Big Data and Big Data Analytics
Section V: Business Analytics Tools. This is the last part; it summarizes what we have learned in the earlier sections by working on case studies. We work on practical cases using public datasets in both R and Python.
- Chapter 17: R Programming for Analytics
- Chapter 18: Python Programming for Analytics
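To give a flavor of the kind of Python-based analysis this title describes, here is a minimal linear regression sketch using scikit-learn; the data and variable names are invented for illustration and are not taken from the book.

    # Illustrative only: fit a simple linear regression with scikit-learn
    import numpy as np
    from sklearn.linear_model import LinearRegression

    # hypothetical data: advertising spend (X) vs. monthly sales (y)
    X = np.array([[10.0], [20.0], [30.0], [40.0], [50.0]])
    y = np.array([25.0, 41.0, 58.0, 74.0, 92.0])

    model = LinearRegression().fit(X, y)
    print(model.coef_[0], model.intercept_)   # estimated slope and intercept
    print(model.predict([[60.0]]))            # predicted sales for a new spend level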
£41.24
APress Architecture of Advanced Numerical Analysis Systems
Book Synopsis: This unique open access book applies the functional OCaml programming language to numerical or computational weighted data science, engineering, and scientific applications. This book is based on the authors' first-hand experience building and maintaining Owl, an OCaml-based numerical computing library. You'll first learn the various components in a modern numerical computation library. Then, you will learn how these components are designed and built up and how to optimize their performance. After reading and using this book, you'll have the knowledge required to design and build real-world complex systems that effectively leverage the advantages of the OCaml functional programming language.
What You Will Learn:
- Optimize core operations based on N-dimensional arrays
- Design and implement an industry-level algorithmic differentiation module
- Implement mathematical optimization, regression, and deep neural network functionalities based on algorithmic differentiation
- Design and optimize a comp…
Table of Contents:
Prologue: A Brief History; Reductionism vs. Holism; Key Features; Contact Me
PART 1: NUMERICAL TECHNIQUES
1. Introduction: What Is Scientific Computing; What Is Functional Programming; Who Is This Book For; Structure of the Book; Installation; Option 1: Install from OPAM; Option 2: Pull from Docker Hub; Option 3: Pin the Dev-Repo; Option 4: Compile from Source; CBLAS/LAPACKE Dependency; Interacting with Owl; Using Toplevel; Using Notebook; Using Owl-Jupyter; Summary
2. Conventions: Pure vs. Impure; Ndarray vs. Scalar; Infix Operators; Operator Extension; Module Structures; Number and Precision; Polymorphic Functions; Module Shortcuts; Type Casting
3. Visualisation: Create Plots; Specification; Subplots; Multiple Lines; Legend; Drawing Patterns; Line Plot; Scatter Plot; Stairs Plot; Box Plot; Stem Plot; Area Plot; Histogram & CDF Plot; Log Plot; 3D Plot; Advanced Statistical Plot; Summary; References
4. Mathematical Functions: Basic Functions; Basic Unary Math Functions; Basic Binary Functions; Exponential and Logarithmic Functions; Trigonometric Functions; Other Math Functions; Special Functions; Airy Functions; Bessel Functions; Elliptic Functions; Gamma Functions; Beta Functions; Struve Functions; Zeta Functions; Error Functions; Integral Functions; Factorials; Interpolation and Extrapolation; Integration; Utility Functions; Summary
5. Statistical Functions: Random Variables; Discrete Random Variables; Continuous Random Variables; Descriptive Statistics; Order Statistics; Special Distributions; Gamma Distribution; Beta Distribution; Chi-Square Distribution; Student-t Distribution; Cauchy Distribution; Multiple Variables; Sampling; Hypothesis Tests; Theory; Gaussian Distribution in Hypothesis Testing; Two-Sample Inferences; Goodness-of-fit Tests; Non-parametric Statistics; Covariance and Correlations; Analysis of Variance; Summary
6. N-Dimensional Arrays: Ndarray Types; Creation Functions; Properties Functions; Map Functions; Fold Functions; Scan Functions; Comparison Functions; Vectorised Functions; Iteration Functions; Manipulation Functions; Serialisation; Tensors; Summary; References
7. Slicing and Broadcasting: Slicing; Basic Slicing; Fancy Slicing; Conventions in Definition; Extended Operators; Advanced Usage; Broadcasting; What Is Broadcasting?; Shape Constraints; Supported Operations; Slicing in NumPy and Julia; Internal Mechanism; Summary
8. Linear Algebra: Vectors and Matrices; Creating Matrices; Accessing Elements; Iterate, Map, Fold, and Filter; Math Operations; Gaussian Elimination; LU Factorisation; Inverse and Transpose; Vector Spaces; Rank and Basis; Orthogonality; Solving Ax = b; Matrix Sensitivity; Determinants; Eigenvalues and Eigenvectors; Solving Ax = λx; Complex Matrices; Similarity Transformation and Diagonalisation; Positive Definite Matrices; Positive Definiteness; Singular Value Decomposition; Internal: CBLAS and LAPACKE; Low-level Interface to CBLAS & LAPACKE; Sparse Matrices; Summary; References
9. Ordinary Differential Equations: What Is an ODE; Exact Solutions; Linear Systems; Solving an ODE Numerically; Owl-ODE; Example: Linear Oscillator System; Solver Structure; Symplectic Solvers; Features and Limits; Examples of Using Owl-ODE; Explicit ODE; Two Body Problem; Lorenz Attractor; Damped Oscillation; Stiffness; Solve Non-Stiff ODEs; Solve Stiff ODEs; Summary; References
10. Signal Processing: Discrete Fourier Transform; Fast Fourier Transform; Examples; Applications of FFT; Find Period of Sunspots; Decipher the Tone; Image Processing; Filtering; Example: Smoothing; Gaussian Filter; Signal Convolution; FFT and Image Convolution; Summary; References
11. Algorithmic Differentiation: Chain Rule; Differentiation Methods; How Algorithmic Differentiation Works; Forward Mode; Reverse Mode; Forward or Reverse?; A Strawman AD Engine; Simple Forward Implementation; Simple Reverse Implementation; Unified Implementations; Forward and Reverse Propagation API; Expressing Computation; Example: Forward Mode; Example: Reverse Mode; High-Level APIs; Derivative and Gradient; Jacobian; Hessian and Laplacian; Other APIs; Internals of Algorithmic Differentiation; Go Beyond Simple Implementation; Extend AD Module; Lazy Evaluation; Summary; References
12. Optimisation: Introduction; Root Finding; Univariate Function Optimisation; Use Derivatives; Golden Section Search; Multivariate Function Optimisation; Nelder-Mead Simplex Method; Gradient Descent Methods; Conjugate Gradient Method; Newton and Quasi-Newton Methods; Global Optimisation and Constrained Optimisation; Summary; References
13. Regression: Linear Regression; Problem: Where to Locate a New McDonald's Restaurant?; Cost Function; Solving the Problem with Gradient Descent; Multiple Regression; Feature Normalisation; Analytical Solution; Non-linear Regressions; Regularisation; OLS, Ridge, Lasso, and Elastic Net; Logistic Regression; Sigmoid Function; Cost Function; Example; Multi-class Classification; Support Vector Machine; Kernel and Non-linear Boundary; Example; Model Error and Selection; Error Metrics; Model Selection; Summary; References
14. Deep Neural Networks: Perceptron; Yet Another Regression; Model Representation; Forward Propagation; Backpropagation; Feed Forward Network; Layers; Activation Functions; Initialisation; Training; Test; Neural Network Module; Module Structure; Neurons; Neural Graph; Training Parameters; Convolutional Neural Network; Recurrent Neural Network; Long Short Term Memory (LSTM); Generative Adversarial Network; Summary; References
15. Natural Language Processing: Introduction; Text Corpus; Step-by-step Operation; Use the Corpus Module; Vector Space Models; Bag of Words (BOW); Term Frequency–Inverse Document Frequency (TF-IDF); Latent Dirichlet Allocation (LDA); Models; Dirichlet Distribution; Gibbs Sampling; Topic Modelling Example; Latent Semantic Analysis (LSA); Search Relevant Documents; Euclidean and Cosine Similarity; Linear Searching; Summary; References
16. Dataframe for Tabular Data: Basic Concepts; Create Frames; Manipulate Frames; Query Frames; Iterate, Map, and Filter; Read/Write CSV Files; Infer Type and Separator; Summary
17. Symbolic Representation: Introduction; Design; Core Abstraction; Engines; ONNX Engine; Example 1: Basic Operations; Example 2: Variable Initialisation; Example 3: Neural Network; LaTeX Engine; Owl Engine; Summary
18. Probabilistic Programming: Generative Model vs Discriminative Model; Bayesian Networks; Sampling Techniques; Inference
PART 2: SYSTEM ARCHITECTURE
19. Architecture Overview: Introduction; Architecture Overview; Core Implementation; N-dimensional Array; Interfaced Libraries; Advanced Functionality; Computation Graph; Algorithmic Differentiation; Regression; Neural Network; Parallel Computing; Actor Engine; GPU Computing; OpenMP; Community-Driven R&D; Summary
20. Core Optimisation: Background; Numerical Libraries; Optimisation of Numerical Computation; Interfacing to C Code; Ndarray Operations; From OCaml to C; Optimisation Techniques; Map Operations; Convolution Operations; Reduction Operations; Repeat Operations; Summary; References
21. Automatic Empirical Tuning: What Is Parameter Tuning; Why Parameter Tuning in Owl; How to Tune OpenMP Parameters; Make a Difference; Summary
22. Computation Graph: Introduction; What Is a Computation Graph?; From Dynamic to Static; Significance in Computing; Examples; Example 01: Basic CGraph; Example 02: CGraph with AD; Example 03: CGraph with DNN; Design Rationale; Optimisation of CGraph; Optimising Memory with Pebbles; Allocation Algorithm; As Intermediate Representations; Summary
23. Scripting and Zoo System: Introduction; Share Script with Zoo; Typical Scenario; Create a Script; Share via Gist; Import in Another Script; Select a Specific Version; Command Line Tool; More Examples; System Design; Services; Type Checking; Backend; Domain Specific Language; Service Discovery; Use Case; Summary; References
24. Compiler Backends: Base Library; Backend: JavaScript; Use Native OCaml; Use Facebook Reason; Backend: MirageOS; MirageOS and Unikernel; Example: Gradient Descent; Example: Neural Network; Evaluation; Summary
25. Distributed Computing: Actor System; Design; Actor Engines; Map-Reduce Engine; Parameter Server Engine; Peer-to-Peer Engine; Classic Synchronise Parallel; Bulk Synchronous Parallel; Asynchronous Parallel; Stale Synchronous Parallel; Probabilistic Synchronise Parallel; Basic Idea: Sampling; Compatibility; Barrier Trade-off Dimensions; Convergence; A Distributed Training Example; Step Progress; Accuracy; Summary; References
26. Testing Framework: Unit Test; Example; What Could Go Wrong; Corner Cases; Test Coverage; Use Functor; Summary
27. Constants and Metric System: What Is a Metric System; Four Metric Systems; SI Prefix; Example: Physics and Math Constants; International System of Units; Time; Length; Area; Volume; Speed; Mass; Force; Energy; Power; Pressure; Viscosity; Luminance; Radioactivity
28. Internal Utility Modules: Dataset Module; MNIST; CIFAR-10; Graph Module; Stack and Heap Modules; Count-Min Sketch; Summary
PART 3: CASE STUDIES
29. Case - Image Recognition: Background; LeNet; AlexNet; VGG; ResNet; SqueezeNet; Capsule Network; Building InceptionV3 Network; InceptionV1 and InceptionV2; Factorisation; Grid Size Reduction; InceptionV3 Architecture; Preparing Weights; Processing Image; Running Inference; Applications; Summary; References
30. Case - Instance Segmentation: Introduction; Mask R-CNN Network; Building Mask R-CNN; Feature Extractor; Proposal Generation; Classification; Run the Code; Summary; References
31. Case - Neural Style Transfer: Content and Style; Content Reconstruction; Style Recreation; Combining Content and Style; Running NST; Extending NST; Fast Style Transfer; Building FST Network; Running FST; Summary; References
32. Case - Recommender System: Introduction; Architecture; Build Topic Models; Index Text Corpus; Random Projection; Optimising Vector Storage; Optimise Data Structure; Optimise Index Algorithm; Search Articles; Code Implementation; Make It Live; Summary; References
33. Case - Applications in Finance: Introduction; Bond Pricing; Black-Scholes Model; Mathematical Model; Option Pricing; Portfolio Optimisation; Mathematical Model; Efficient Frontier; Maximise Sharpe Ratio
£33.74
APress Make Your Data Speak
Book Synopsis
Table of Contents:
Introduction. Three stories that made me write this book
Chapter 1. Data preparation: 1.1 Analyzing and transforming the original data 1.2 Preparing the basis for a dashboard 1.3 Making data samples for visualizations 1.4 Setting up an interactivity 1.5 Summary and conclusions of the chapter, quick tricks
Chapter 2. Dashboard assembling: 2.1 Assembling a dashboard according to the layout 2.2 Creating KPI cards 2.3 Aligning a dashboard, adding a header 2.4 Summary and conclusions of the chapter, quick tricks
Chapter 3. Anatomy of diagrams: 3.1 Analyzing ready-made design styles 3.2 Setting up data labels 3.3 Working with the text: remove the excess, add the necessary 3.4 Designing bar charts 3.5 Setting up the chart template 3.6 Summary and conclusions of the chapter, quick tricks
Chapter 4. Final dashboard design: 4.1 Aligning the headers to the grid 4.2 Creating new cards on the top of the cells 4.3 Making interactive slicers 4.4 Working with Excel colors and fonts 4.5 Improving standard Excel themes 4.6 Summary and conclusions of the chapter, quick tricks
Chapter 5. Corporate identity: 5.1 Creating a theme in accordance with the brandbook 5.2 Adapting the theme according to the checklist 5.3 Creating a dashboard in a dark theme 5.4 Summary and conclusions of the chapter, quick tricks
Chapter 6. Data visualization rules: 6.1 Types of data analysis 6.2 How to choose charts 6.3 Life hacks for multiple data series 6.4 When you need everything at once 6.5 Funnel and waterfall 6.6 Summary and conclusions of the chapter, quick tricks
Conclusion
£41.24
APress Building Responsible AI Algorithms
Book Synopsis: This book introduces a Responsible AI framework and guides you through processes to apply at each stage of the machine learning (ML) life cycle, from problem definition to deployment, to reduce and mitigate the risks and harms found in artificial intelligence (AI) technologies. AI offers the ability to solve many problems today if implemented correctly and responsibly. This book helps you avoid negative impacts that in some cases have caused loss of life and develop models that are fair, transparent, safe, secure, and robust. The approach in this book raises your awareness of the missteps that can lead to negative outcomes in AI technologies and provides a Responsible AI framework to deliver responsible and ethical results in ML. It begins with an examination of the foundational elements of responsibility, principles, and data. Next comes guidance on implementation addressing issues such as fairness, transparency, safety, privacy, and robustness. The book helps you think responsibl…
Table of Contents:
Introduction
Part I. Foundation: 1. Responsibility; 2. AI Principles; 3. Data
Part II. Implementation: 4. Responsible AI Framework; 5. Fairness; 6. Safety; 7. Humans in the Loop; 8. Transparency; 9. Privacy and Robustness
Part III. Ethical Considerations: 10. Ethics of AI and ML
References
£25.19
De Gruyter Practical AI for Business Leaders, Product Managers, and Entrepreneurs
Book Synopsis: Most economists agree that AI is a general purpose technology (GPT) like the steam engine, electricity, and the computer. AI will drive innovation in all sectors of the economy for the foreseeable future. Practical AI for Business Leaders, Product Managers, and Entrepreneurs is a technical guidebook for the business leader or anyone responsible for leading AI-related initiatives in their organization. The book can also be used as a foundation to explore the ethical implications of AI. Authors Alfred Essa and Shirin Mojarad provide a gentle introduction to foundational topics in AI. Each topic is framed as a triad: concept, theory, and practice. The concept chapters develop the intuition, culminating in a practical case study. The theory chapters reveal the underlying technical machinery. The practice chapters provide code in Python to implement the models discussed in the case study. With this book, readers will learn the technical foundations of machine learning and deep lea…
Table of Contents:
Introduction: What is AI and why is it at the center of major business transformation? How is it related to machine learning? What is deep learning, and how is it related to ML? Why is it important? How the book is organized. Who is the audience?
Section 1: Machine Learning. Chapter 1.1, Introduction, machine learning, different types of machine learning; Chapter 1.2, Machine Learning Technical Overview; Chapter 1.3, Hands-On Machine Learning with Scikit-Learn; Chapter 1.4, Advanced Topics/Flavors of Machine Learning; Appendix: Mathematical Interlude
Section 2: Deep Learning. Chapter 2.1, Introduction (what is it, why is it important); Chapter 2.2, Deep Learning Technical Overview; Chapter 2.3, Hands-On Deep Learning with Keras; Chapter 2.4, Advanced Topics/Flavors of Deep Learning; Appendix: Mathematical Interlude
Section 3: Putting AI into Practice: Innovation Framework. Chapter 3.1: Diffusion and Dynamics of Innovation; Chapter 3.2: Managing an Innovation Portfolio
£40.95
De Gruyter Random Number Generators: Principles and Practices
Book Synopsis: Random Number Generators: Principles and Practices has been written for programmers, hardware engineers, and sophisticated hobbyists interested in understanding random number generators and gaining the tools necessary to work with random number generators with confidence and knowledge. Using an approach that employs clear diagrams and running code examples rather than excessive mathematics, random number related topics such as entropy estimation, entropy extraction, entropy sources, PRNGs, randomness testing, distribution generation, and many others are exposed and demystified. If you have ever:
- Wondered how to test if data is really random
- Needed to measure the randomness of data in real time as it is generated
- Wondered how to get randomness into your programs
- Wondered whether or not a random number generator is trustworthy
- Wanted to be able to choose between random number generator solutions
- Needed to turn uniform random data into a different distribution
- Needed…
Table of Contents:
1 Introduction: 1.1 Tools; 1.2 Terminology; 1.3 The Many Types of Random Numbers; 1.3.1 Uniform Random Numbers
2 Random Number Generators: 2.1 Classes of Random Number Generators; 2.2 Names for RNGs
3 Making Random Numbers: 3.1 A Quick Overview of the RNG Types; 3.2 The Structure of Full RNG Implementations; 3.3 Pool Extractor Structures; 3.4 Multiple Input Extractors
4 Physically Uncloneable Functions: 4.1 The Other Kind – Static vs. Dynamic Random Number Generators
5 Testing Random Numbers: 5.1 Known Answer Tests; 5.2 Distinguishing From Random; 5.3 PRNG Test Suites; 5.4 Entropy Measurements; 5.5 Min Entropy Estimation; 5.6 Model Equivalence Testing; 5.7 Statistical Prerequisite Testing; 5.8 The Problem of Distinguishing Entropy and Pseudo-randomness; 5.9 PRNG Tests: DieHarder, NIST SP800-22, TestU01, China ICS 35.040; 5.10 Entropy Measurements; 5.11 Min Entropy Measurements; 5.12 Modeling to Test a Source; 5.13 Statistical Prerequisites; 5.14 Testing for Bias; 5.15 Results That Are 'Too Good' (e.g., Chi-square == 0.5); 5.16 Distinguishing Correlation from Bias; 5.17 Testing for Stationary Properties; 5.18 FFT Analysis; 5.19 Online Testing; 5.20 Working From the Source RNG; 5.21 Tools; 5.22 Summary
6 Entropy Extraction or Distillation: 6.1 A Simple Extractor, the XOR Gate; 6.2 A Simple Way of Improving the Distribution of Random Numbers That Have Known Missing Values Using XOR
7 Quantifying Entropy: 7.1 Rényi Entropy; 7.2 Distance From Uniform
8 Topics to put somewhere in the book (in existing chapters and new chapters): 8.1 XOR as a 2-Bit Extractor; 8.2 Properties of Real Random Numbers; 8.3 Binomial Distributions; 8.4 Normal Distributions; 8.4.1 Dice, More Dice; 8.4.2 Central Limit Theorem; 8.5 Seeing Patterns; 8.6 Regression to the Mean; 8.7 Lack of Correlation, Bias, Algorithmic Connections, Predictability; 8.8 What's a True Random Number?; 8.9 Random Numbers in Cryptography; 8.10 Things They Help With: Liveness, Unpredictability, Resistance to Attacks; 8.11 Examples of Use; 8.11.1 Salting Passwords; 8.11.2 802.11i Exchange; 8.11.3 PKMv2 Exchange; 8.11.4 Making Keys; 8.12 Examples of RNG Crypto Failures; 8.12.1 Sony PS3 Attack; 8.12.2 MiFare Classic; 8.12.3 Online Poker; 8.12.4 Debian OpenSSL Fiasco; 8.12.5 Linux Boot Time Entropy; 8.13 Humans and Random Numbers; 8.14 Result of Asking People for a Random Number; 8.14.1 Normal People; 8.14.2 Crypto People; 8.15 Mental Random Number Tricks; 8.15.1 How to Think of a Really Random Number; 8.16 PRNGs; 8.17 Extractors; 8.17.1 CBC MAC; 8.17.2 BIW; 8.17.3 Von Neumann; 8.18 Extractor Theory; 8.19 Random Number Standards; 8.19.1 SP800-90A B C; 8.19.2 ANSI X9.82; 8.20 PRNG Algorithms; 8.20.1 SP800-90A CTR DRBG; 8.20.2 SP800-90A SHA DRBG; 8.20.3 XOR Construction; 8.20.4 Oversampling Construction; 8.21 Yarrow; 8.22 Whirlpool; 8.23 Linux Kernel Random Service; 8.24 Appendices; 8.25 Resources; 8.25.1 SW Sources; 8.25.2 Online Random Number Sources; 8.26 Example Algorithm Vectors; 8.26.1 SP800-90A CTR DRBG 128 & 256; 8.26.2 SP800-90A Hash DRBG SHA-1 & SHA-256; 8.26.3 AES-CBC-MAC Conditioner 128; 8.26.4 AES-CBC-MAC Conditioner; 8.27 SP800-90 LZ Tests Issues
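As a taste of the extractor material listed above, here is a small Python sketch of the classic Von Neumann debiaser, which turns a biased but independent bit stream into unbiased output bits; it illustrates the general idea only and is not code from the book.

    # Von Neumann extractor (illustrative): consume bit pairs, keep 01 -> 0 and 10 -> 1
    def von_neumann_extract(bits):
        out = []
        for i in range(0, len(bits) - 1, 2):
            a, b = bits[i], bits[i + 1]
            if a != b:            # 00 and 11 pairs are discarded
                out.append(a)     # 01 yields 0, 10 yields 1
        return out

    biased = [1, 1, 0, 1, 1, 0, 0, 0, 1, 0, 1, 1]
    print(von_neumann_extract(biased))   # unbiased bits, at a reduced rate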
£48.38
Nova Science Publishers Inc Embedded Cryptographic Hardware: Methodologies &
Book Synopsis: Modern cryptology, which is the basis of information security techniques, started in the late 1970s and developed in the 1980s. As communication networks were spreading deep into society, the need for secure communication greatly promoted cryptographic research. The need for fast but secure cryptographic systems continues to grow, so dedicated systems for cryptography are becoming a key issue for designers. With the spread of reconfigurable hardware such as FPGAs, hardware implementations of cryptographic algorithms become cost-effective. The focus of this book is on all aspects of embedded cryptographic hardware. Of special interest are contributions that describe new secure and fast hardware implementations and new efficient algorithms, methodologies, and protocols for secure communications. This book is organised in two parts. The first part is dedicated to embedded hardware of cryptosystems, while the second part focuses on new algorithms for cryptography, design methodologies, and secure protocols.
£129.74
Nova Science Publishers Inc Trends in Computer Science
Book Synopsis
£143.24
Nova Science Publishers Inc Algorithms & Tools for Parallel Computing on
Book Synopsis: This book features chapters which explore algorithms, programming languages, systems, tools, and theoretical models aimed at high performance computing on heterogeneous networks of computers.
£73.49
Nova Science Publishers Inc Flexible Text Searching
Book Synopsis
£39.74
Nova Science Publishers Inc Measuring Power of Algorithms, Programs &
Book Synopsis: We are living in a world where the complexity of systems created and studied by people grows beyond all imaginable limits. Computers, their software, and their networks are among the most complicated systems of our time. Science is the only efficient tool for dealing with this overwhelming complexity. One of the methodologies developed in science is the axiomatic approach, which proved to be very powerful in mathematics. In this book, the authors further develop an axiomatic approach to computer science initiated by Floyd, Manna, Blum, and other researchers. In the traditional constructive setting, different classes of algorithms (programs, processes, or automata) are studied separately, with some indication of relations between these classes. In such a way, the constructive approach gave birth to the theory of Turing machines, the theory of partial recursive functions, the theory of finite automata, and other theories of constructive models of algorithms. The axiomatic context allows one to study collections of classes of algorithms, automata, and processes. These classes are united in a collection by common properties in the form of axioms. As a result, the axiomatic approach moves higher in the hierarchy of computer and network models, thereby reducing the complexity of their study.
£92.99
Manning Publications RabbitMQ in Depth
Book Synopsis
DESCRIPTION: Any large application needs an efficient way to handle the constant messages passing between components in the system. Billed as "messaging that just works," the RabbitMQ message broker initially appeals to developers because it's lightweight, easy to set up, and low maintenance. They stick with it because it's powerful, fast, and up to nearly anything that can be thrown at it. This book takes readers beyond the basics and explores the challenges of clustering and distributing messages across enterprise-level data centers using RabbitMQ. RabbitMQ in Depth is a practical guide to building and maintaining message-based systems. This book covers detailed architectural and operational use of RabbitMQ with an emphasis on not just how it works but why it works the way it does. It provides examples and detailed explanations of everything from low-level communication to integration with third-party systems. It also offers insights needed to make core architectural choices and develop procedures for effective operational management.
KEY FEATURES: Approachable, detailed resource; explains the "how" and "why" of RabbitMQ; takes readers well beyond the basics.
AUDIENCE: Written for programmers with a basic understanding of message-oriented systems and RabbitMQ.
ABOUT THE TECHNOLOGY: RabbitMQ is open-source message broker software that programs can use to exchange messages with each other to create scalable and reliable application architectures.
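For readers who have never touched a message broker, the following sketch publishes one message to a RabbitMQ queue from Python using the pika client; it assumes a broker running on localhost and a queue name chosen only for illustration, and it is not an example from the book itself (the book goes far deeper and may use a different client library).

    # Minimal RabbitMQ publish example using the pika client (assumes a local broker)
    import pika

    connection = pika.BlockingConnection(pika.ConnectionParameters(host="localhost"))
    channel = connection.channel()
    channel.queue_declare(queue="task_queue", durable=True)   # hypothetical queue name
    channel.basic_publish(
        exchange="",
        routing_key="task_queue",
        body=b"hello, world",
        properties=pika.BasicProperties(delivery_mode=2),     # mark the message persistent
    )
    connection.close()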
£43.19
Manning Publications Succeeding with AI
Book Synopsis: The big challenge for a successful AI project isn't deciding which problems you can solve. It's deciding which problems you should solve. In Managing Successful AI Projects, author and AI consultant Veljko Krunic reveals secrets for succeeding in AI that he developed with Fortune 500 companies, early-stage start-ups, and other businesses across multiple industries.
Key Features:
· Selecting the right AI project to meet specific business goals
· Economizing resources to deliver the best value for money
· Measuring the success of your AI efforts in business terms
· Predicting whether you are on the right track to deliver your intended business results
For executives, managers, team leaders, and business-focused data scientists. No specific technical knowledge or programming skills required.
About the technology: Companies small and large are initiating AI projects, investing vast sums of money on software, developers, and data scientists. Too often, these AI projects focus on technology at the expense of actionable or tangible business results, resulting in scattershot results and wasted investment. Managing Successful AI Projects sets out a blueprint for AI projects to ensure they are predictable, successful, and profitable. It's filled with practical techniques for running data science programs that ensure they're cost-effective and focused on the right business goals. Veljko Krunic is an independent data science consultant who has worked with companies that range from start-ups to Fortune 10 enterprises. He holds a PhD in Computer Science and an MS in Engineering Management, both from the University of Colorado at Boulder. He is also a Six Sigma Master Black Belt.
£35.99
Manning Publications Learn Concurrent Programming with Go
Book Synopsis: Write concurrent code in Go that improves application performance, scales up to handle bigger loads, and takes full advantage of modern multi-processor hardware. Suitable for programmers who already know the basics of Go or another C-style language. No experience in concurrent programming required. In Learn Concurrent Programming with Go you will learn how to:
- Implement effective concurrency for more responsive, higher performing, scalable software
- Avoid common concurrency problems such as deadlocks and race conditions
- Manage concurrency using goroutines, mutexes, readers-writer locks, and more
- Identify concurrency patterns such as pipelining, worker pools, and message passing
- Discover the advantages, limits, and properties of parallel computing
- Improve your Go coding skills with advanced multithreading
About the technology: Concurrent programming is essential for getting the most out of modern multi-processor computer hardware. It allows multiple tasks to execute and interact simultaneously, speeding up performance and reducing user wait time. Thanks to its baked-in concurrency models, Google's Go is one of the best languages you can use to learn and apply concurrent programming to your systems.
£45.99
MIT Press Ltd Distributed Algorithms: An Intuitive Approach
Book Synopsis
£40.00
Manning Publications Machine Learning Algorithms in Depth
Book Synopsis: Develop a mathematical intuition around machine learning algorithms to improve model performance and effectively troubleshoot complex ML problems. For intermediate machine learning practitioners familiar with linear algebra, probability, and basic calculus. Machine Learning Algorithms in Depth dives into the design and underlying principles of some of the most exciting machine learning (ML) algorithms in the world today. With a particular emphasis on probability-based algorithms, you will learn the fundamentals of Bayesian inference and deep learning. You will also explore the core data structures and algorithmic paradigms for machine learning. You will explore practical implementations of dozens of ML algorithms, including:
- Monte Carlo Stock Price Simulation
- Image Denoising using Mean-Field Variational Inference
- EM Algorithm for Hidden Markov Models
- Imbalanced Learning, Active Learning, and Ensemble Learning
- Bayesian Optimisation for Hyperparameter Tuning
- Dirichlet Process K-Means for Clustering Applications
- Stock Clusters based on Inverse Covariance Estimation
- Energy Minimisation using Simulated Annealing
- Image Search based on ResNet Convolutional Neural Network
- Anomaly Detection in Time-Series using Variational Autoencoders
Each algorithm is fully explored with both math and practical implementations so you can see how they work and put them into action.
About the technology: Fully understanding how machine learning algorithms function is essential for any serious ML engineer. This vital knowledge lets you modify algorithms to your specific needs, understand the trade-offs when picking an algorithm for a project, and better interpret and explain your results to your stakeholders. This unique guide will take you from relying on one-size-fits-all ML libraries to developing your own algorithms to solve your business needs.
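The first listed example, Monte Carlo stock price simulation, can be sketched in a few lines of NumPy; the drift, volatility, and horizon below are arbitrary illustrative values rather than figures from the book.

    # Monte Carlo simulation of geometric Brownian motion (illustrative parameters)
    import numpy as np

    rng = np.random.default_rng(seed=0)
    s0, mu, sigma = 100.0, 0.05, 0.2          # start price, annual drift, annual volatility
    steps, paths, dt = 252, 10_000, 1 / 252   # one year of daily steps

    z = rng.standard_normal((paths, steps))
    log_returns = (mu - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z
    prices = s0 * np.exp(np.cumsum(log_returns, axis=1))

    print(prices[:, -1].mean())               # estimate of the expected terminal price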
£51.84
MIT Press Ltd Algorithms Unlocked
Book Synopsis: For anyone who has ever wondered how computers solve problems, an engagingly written guide for nonexperts to the basics of computer algorithms. Have you ever wondered how your GPS can find the fastest way to your destination, selecting one route from seemingly countless possibilities in mere seconds? How your credit card account number is protected when you make a purchase over the Internet? The answer is algorithms. And how do these mathematical formulations translate themselves into your GPS, your laptop, or your smart phone? This book offers an engagingly written guide to the basics of computer algorithms. In Algorithms Unlocked, Thomas Cormen—coauthor of the leading college textbook on the subject—provides a general explanation, with limited mathematics, of how algorithms enable computers to solve problems. Readers will learn what computer algorithms are, how to describe them, and how to evaluate them. They will discover simple ways to search for information in a computer; methods for rearranging information in a computer into a prescribed order ("sorting"); how to solve basic problems that can be modeled in a computer with a mathematical structure called a "graph" (useful for modeling road networks, dependencies among tasks, and financial relationships); how to solve problems that ask questions about strings of characters such as DNA structures; the basic principles behind cryptography; fundamentals of data compression; and even that there are some problems that no one has figured out how to solve on a computer in a reasonable amount of time.
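As a concrete instance of the "searching for information" idea the synopsis mentions, here is a short binary search in Python; it is a generic textbook routine, not code taken from the book.

    # Binary search over a sorted list: O(log n) comparisons
    def binary_search(items, target):
        lo, hi = 0, len(items) - 1
        while lo <= hi:
            mid = (lo + hi) // 2
            if items[mid] == target:
                return mid            # found: return the index
            if items[mid] < target:
                lo = mid + 1          # discard the lower half
            else:
                hi = mid - 1          # discard the upper half
        return -1                     # not present

    print(binary_search([2, 3, 5, 7, 11, 13, 17], 11))   # -> 4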
£26.10
Pearson Education (US) Data Structures and Algorithm Analysis in C++
Book Synopsis: Mark Allen Weiss is Professor and Associate Director for the School of Computing and Information Sciences at Florida International University. He is also currently serving as both Director of Undergraduate Studies and Director of Graduate Studies. He received his Bachelor's Degree in Electrical Engineering from the Cooper Union in 1983, and his Ph.D. in Computer Science from Princeton University in 1987, working under Bob Sedgewick. He has been at FIU since 1987 and was promoted to Professor in 1996. His interests include data structures, algorithms, and education. He is most well known for his highly acclaimed Data Structures textbooks, which have been used for a generation by roughly a million students. Professor Weiss is the author of numerous publications in top-rated journals and was recipient of the University's Excellence in Research Award in 1994. In 1996 at FIU he was the first in the world to teach Data Structures using the Java programming language, which is now th…
Table of Contents:
Chapter 1 Programming: A General Overview 1 1.1 What's This Book About? 1 1.2 Mathematics Review 2 1.2.1 Exponents 3 1.2.2 Logarithms 3 1.2.3 Series 4 1.2.4 Modular Arithmetic 5 1.2.5 The P Word 6 1.3 A Brief Introduction to Recursion 8 1.4 C++ Classes 12 1.4.1 Basic class Syntax 12 1.4.2 Extra Constructor Syntax and Accessors 13 1.4.3 Separation of Interface and Implementation 16 1.4.4 vector and string 19 1.5 C++ Details 21 1.5.1 Pointers 21 1.5.2 Lvalues, Rvalues, and References 23 1.5.3 Parameter Passing 25 1.5.4 Return Passing 27 1.5.5 std::swap and std::move 29 1.5.6 The Big-Five: Destructor, Copy Constructor, Move Constructor, Copy Assignment operator=, Move Assignment operator= 30 1.5.7 C-style Arrays and Strings 35 1.6 Templates 36 1.6.1 Function Templates 37 1.6.2 Class Templates 38 1.6.3 Object, Comparable, and an Example 39 1.6.4 Function Objects 41 1.6.5 Separate Compilation of Class Templates 44 1.7 Using Matrices 44 1.7.1 The Data Members, Constructor, and Basic Accessors 44 1.7.2 operator[] 45 1.7.3 Big-Five 46 Summary 46 Exercises 46 References 48
Chapter 2 Algorithm Analysis 51 2.1 Mathematical Background 51 2.2 Model 54 2.3 What to Analyze 54 2.4 Running-Time Calculations 57 2.4.1 A Simple Example 58 2.4.2 General Rules 58 2.4.3 Solutions for the Maximum Subsequence Sum Problem 60 2.4.4 Logarithms in the Running Time 66 2.4.5 Limitations of Worst Case Analysis 70 Summary 70 Exercises 71 References 76
Chapter 3 Lists, Stacks, and Queues 77 3.1 Abstract Data Types (ADTs) 77 3.2 The List ADT 78 3.2.1 Simple Array Implementation of Lists 78 3.2.2 Simple Linked Lists 79 3.3 vector and list in the STL 80 3.3.1 Iterators 82 3.3.2 Example: Using erase on a List 83 3.3.3 const_iterators 84 3.4 Implementation of vector 86 3.5 Implementation of list 91 3.6 The Stack ADT 103 3.6.1 Stack Model 103 3.6.2 Implementation of Stacks 104 3.6.3 Applications 104 3.7 The Queue ADT 112 3.7.1 Queue Model 113 3.7.2 Array Implementation of Queues 113 3.7.3 Applications of Queues 115 Summary 116 Exercises 116
Chapter 4 Trees 121 4.1 Preliminaries 121 4.1.1 Implementation of Trees 122 4.1.2 Tree Traversals with an Application 123 4.2 Binary Trees 126 4.2.1 Implementation 128 4.2.2 An Example: Expression Trees 128 4.3 The Search Tree ADT–Binary Search Trees 132 4.3.1 contains 134 4.3.2 findMin and findMax 135 4.3.3 insert 136 4.3.4 remove 139 4.3.5 Destructor and Copy Constructor 141 4.3.6 Average-Case Analysis 141 4.4 AVL Trees 144 4.4.1 Single Rotation 147 4.4.2 Double Rotation 149 4.5 Splay Trees 158 4.5.1 A Simple Idea (That Does Not Work) 158 4.5.2 Splaying 160 4.6 Tree Traversals (Revisited) 166 4.7 B-Trees 168 4.8 Sets and Maps in the Standard Library 173 4.8.1 Sets 173 4.8.2 Maps 174 4.8.3 Implementation of set and map 175 4.8.4 An Example That Uses Several Maps 176 Summary 181 Exercises 182 References 189
Chapter 5 Hashing 193 5.1 General Idea 193 5.2 Hash Function 194 5.3 Separate Chaining 196 5.4 Hash Tables without Linked Lists 201 5.4.1 Linear Probing 201 5.4.2 Quadratic Probing 202 5.4.3 Double Hashing 207 5.5 Rehashing 208 5.6 Hash Tables in the Standard Library 210 5.7 Hash Tables with Worst-Case O(1) Access 212 5.7.1 Perfect Hashing 213 5.7.2 Cuckoo Hashing 215 5.7.3 Hopscotch Hashing 224 5.8 Universal Hashing 230 5.9 Extendible Hashing 233 Summary 236 Exercises 238 References 242
Chapter 6 Priority Queues (Heaps) 245 6.1 Model 245 6.2 Simple Implementations 246 6.3 Binary Heap 247 6.3.1 Structure Property 247 6.3.2 Heap-Order Property 248 6.3.3 Basic Heap Operations 249 6.3.4 Other Heap Operations 252 6.4 Applications of Priority Queues 257 6.4.1 The Selection Problem 258 6.4.2 Event Simulation 259 6.5 d-Heaps 260 6.6 Leftist Heaps 261 6.6.1 Leftist Heap Property 261 6.6.2 Leftist Heap Operations 262 6.7 Skew Heaps 269 6.8 Binomial Queues 271 6.8.1 Binomial Queue Structure 271 6.8.2 Binomial Queue Operations 271 6.8.3 Implementation of Binomial Queues 276 6.9 Priority Queues in the Standard Library 283 Summary 283 Exercises 283 References 288
Chapter 7 Sorting 291 7.1 Preliminaries 291 7.2 Insertion Sort 292 7.2.1 The Algorithm 292 7.2.2 STL Implementation of Insertion Sort 293 7.2.3 Analysis of Insertion Sort 294 7.3 A Lower Bound for Simple Sorting Algorithms 295 7.4 Shellsort 296 7.4.1 Worst-Case Analysis of Shellsort 297 7.5 Heapsort 300 7.5.1 Analysis of Heapsort 301 7.6 Mergesort 304 7.6.1 Analysis of Mergesort 306 7.7 Quicksort 309 7.7.1 Picking the Pivot 311 7.7.2 Partitioning Strategy 313 7.7.3 Small Arrays 315 7.7.4 Actual Quicksort Routines 315 7.7.5 Analysis of Quicksort 318 7.7.6 A Linear-Expected-Time Algorithm for Selection 321 7.8 A General Lower Bound for Sorting 323 7.8.1 Decision Trees 323 7.9 Decision-Tree Lower Bounds for Selection Problems 325 7.10 Adversary Lower Bounds 328 7.11 Linear-Time Sorts: Bucket Sort and Radix Sort 331 7.12 External Sorting 336 7.12.1 Why We Need New Algorithms 336 7.12.2 Model for External Sorting 336 7.12.3 The Simple Algorithm 337 7.12.4 Multiway Merge 338 7.12.5 Polyphase Merge 339 7.12.6 Replacement Selection 340 Summary 341 Exercises 341 References 347
Chapter 8 The Disjoint Sets Class 351 8.1 Equivalence Relations 351 8.2 The Dynamic Equivalence Problem 352 8.3 Basic Data Structure 353 8.4 Smart Union Algorithms 357 8.5 Path Compression 360 8.6 Worst Case for Union-by-Rank and Path Compression 361 8.6.1 Slowly Growing Functions 362 8.6.2 An Analysis by Recursive Decomposition 362 8.6.3 An O( M log *N ) Bound 369 8.6.4 An O( M α(M, N) ) Bound 370 8.7 An Application 372 Summary 374 Exercises 375 References 376
Chapter 9 Graph Algorithms 379 9.1 Definitions 379 9.1.1 Representation of Graphs 380 9.2 Topological Sort 382 9.3 Shortest-Path Algorithms 386 9.3.1 Unweighted Shortest Paths 387 9.3.2 Dijkstra's Algorithm 391 9.3.3 Graphs with Negative Edge Costs 400 9.3.4 Acyclic Graphs 400 9.3.5 All-Pairs Shortest Path 404 9.3.6 Shortest Path Example 404 9.4 Network Flow Problems 406 9.4.1 A Simple Maximum-Flow Algorithm 408 9.5 Minimum Spanning Tree 413 9.5.1 Prim's Algorithm 414 9.5.2 Kruskal's Algorithm 417 9.6 Applications of Depth-First Search 419 9.6.1 Undirected Graphs 420 9.6.2 Biconnectivity 421 9.6.3 Euler Circuits 425 9.6.4 Directed Graphs 429 9.6.5 Finding Strong Components 431 9.7 Introduction to NP-Completeness 432 9.7.1 Easy vs. Hard 433 9.7.2 The Class NP 434 9.7.3 NP-Complete Problems 434 Summary 437 Exercises 437 References 445
Chapter 10 Algorithm Design Techniques 449 10.1 Greedy Algorithms 449 10.1.1 A Simple Scheduling Problem 450 10.1.2 Huffman Codes 453 10.1.3 Approximate Bin Packing 459 10.2 Divide and Conquer 467 10.2.1 Running Time of Divide-and-Conquer Algorithms 468 10.2.2 Closest-Points Problem 470 10.2.3 The Selection Problem 475 10.2.4 Theoretical Improvements for Arithmetic Problems 478 10.3 Dynamic Programming 482 10.3.1 Using a Table Instead of Recursion 483 10.3.2 Ordering Matrix Multiplications 485 10.3.3 Optimal Binary Search Tree 487 10.3.4 All-Pairs Shortest Path 491 10.4 Randomized Algorithms 494 10.4.1 Random-Number Generators 495 10.4.2 Skip Lists 500 10.4.3 Primality Testing 503 10.5 Backtracking Algorithms 506 10.5.1 The Turnpike Reconstruction Problem 506 10.5.2 Games 511 Summary 518 Exercises 518 References 527
Chapter 11 Amortized Analysis 533 11.1 An Unrelated Puzzle 534 11.2 Binomial Queues 534 11.3 Skew Heaps 539 11.4 Fibonacci Heaps 541 11.4.1 Cutting Nodes in Leftist Heaps 542 11.4.2 Lazy Merging for Binomial Queues 544 11.4.3 The Fibonacci Heap Operations 548 11.4.4 Proof of the Time Bound 549 11.5 Splay Trees 551 Summary 555 Exercises 556 References 557
Chapter 12 Advanced Data Structures and Implementation 559 12.1 Top-Down Splay Trees 559 12.2 Red-Black Trees 566 12.2.1 Bottom-Up Insertion 567 12.2.2 Top-Down Red-Black Trees 568 12.2.3 Top-Down Deletion 570 12.3 Treaps 576 12.4 Suffix Arrays and Suffix Trees 579 12.4.1 Suffix Arrays 580 12.4.2 Suffix Trees 583 12.4.3 Linear-Time Construction of Suffix Arrays and Suffix Trees 586 12.5 k-d Trees 596 12.6 Pairing Heaps 602 Summary 606 Exercises 608 References 612
Appendix A Separate Compilation of Class Templates 615 A.1 Everything in the Header 616 A.2 Explicit Instantiation 616
Index 619
£150.11
Manning Publications Machine Learning with TensorFlow
Book Synopsis
DESCRIPTION: Being able to make near-real-time decisions is becoming increasingly crucial. To succeed, we need machine learning systems that can turn massive amounts of data into valuable insights. But when you're just starting out in the data science field, how do you get started creating machine learning applications? The answer is TensorFlow, a new open source machine learning library from Google. The TensorFlow library can take your high-level designs and turn them into the low-level mathematical operations required by machine learning algorithms. Machine Learning with TensorFlow teaches readers about machine learning algorithms and how to implement solutions with TensorFlow. It starts with an overview of machine learning concepts and moves on to the essentials needed to begin using TensorFlow. Each chapter zooms into a prominent example of machine learning. Readers can cover them all to master the basics or skip around to cater to their needs. By the end of this book, readers will be able to solve classification, clustering, regression, and prediction problems in the real world.
KEY FEATURES: Lots of diagrams, code examples, and exercises; solves real-world problems with TensorFlow; uses well-studied neural network architectures; presents code that can be used for the readers' own applications.
AUDIENCE: This book is for programmers who have some experience with Python and linear algebra concepts like vectors and matrices. No experience with machine learning is necessary.
ABOUT THE TECHNOLOGY: Google open-sourced their machine learning framework called TensorFlow in late 2015 under the Apache 2.0 license. Before that, it was used proprietarily by Google in its speech recognition, Search, Photos, and Gmail, among other applications. TensorFlow is one of the most popular machine learning libraries.
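To show what "turning a high-level design into low-level operations" looks like in practice, here is a tiny gradient-descent fit written against the modern TensorFlow 2 API; note that this listing's first edition targets the older TensorFlow 1 interface, so treat this purely as an illustration of the library, not of the book's own code.

    # Fit y = 3x + 2 by gradient descent with TensorFlow 2 (illustrative, not from the book)
    import tensorflow as tf

    x = tf.constant([[0.0], [1.0], [2.0], [3.0]])
    y = 3.0 * x + 2.0
    w = tf.Variable(0.0)
    b = tf.Variable(0.0)
    opt = tf.keras.optimizers.SGD(learning_rate=0.1)

    for _ in range(300):
        with tf.GradientTape() as tape:
            loss = tf.reduce_mean(tf.square(w * x + b - y))
        grads = tape.gradient(loss, [w, b])
        opt.apply_gradients(zip(grads, [w, b]))

    print(float(w), float(b))   # should approach 3.0 and 2.0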
£47.00
Elsevier Science & Technology Network Algorithmics
Book Synopsis
Table of Contents:
Part I: Rules of the Game – 1. Introducing Network Algorithmics; 2. Network Implementation Models; 3. Fifteen Implementation Principles; 4. Principles in Action
Part II: Playing with Endnodes – 5. Copying Data; 6. Transferring Control; 7. Maintaining Timers; 8. Demultiplexing; 9. Protocol Processing
Part III: Playing with Routers – 10. Exact-Match Lookups; 11. Prefix-Match Lookups; 12. Packet Classification; 13. Switching; 14. Scheduling Packets; 15. Routers as Distributed Systems
Part IV: Endgame – 16. Measuring Network Traffic; 17. Network Security; 18. Conclusions
Appendix: Detailed Models
£62.06
MIT Press Introduction to Algorithms, Fourth Edition
Book Synopsis
£123.25
MIT Press Algorithms (MIT Press Essential Knowledge series)
Book Synopsis: An accessible introduction to algorithms, explaining not just what they are but how they work, with examples from a wide range of application areas. Digital technology runs on algorithms, sets of instructions that describe how to do something efficiently. Application areas range from search engines to tournament scheduling, DNA sequencing, and machine learning. Arguing that every educated person today needs to have some understanding of algorithms and what they do, in this volume in the MIT Press Essential Knowledge series, Panos Louridas offers an introduction to algorithms that is accessible to the nonspecialist reader. Louridas explains not just what algorithms are but also how they work, offering a wide range of examples and keeping mathematics to a minimum. After discussing what an algorithm does and how its effectiveness can be measured, Louridas covers three of the most fundamental application areas: graphs, which describe networks, from eighteenth-century proble…
£15.19
MIT Press Human-Centered Data Science: An Introduction
Book Synopsis: Best practices for addressing the bias and inequality that may result from the automated collection, analysis, and distribution of large datasets. Human-centered data science is a new interdisciplinary field that draws from human-computer interaction, social science, statistics, and computational techniques. This book, written by founders of the field, introduces best practices for addressing the bias and inequality that may result from the automated collection, analysis, and distribution of very large datasets. It offers a brief and accessible overview of many common statistical and algorithmic data science techniques, explains human-centered approaches to data science problems, and presents practical guidelines and real-world case studies to help readers apply these methods. The authors explain how data scientists’ choices are involved at every stage of the data science workflow—and show how a human-centered approach can enhance each one, by mak…
£31.35
Springer-Verlag New York Inc. Introduction to Cryptography
Book Synopsis: 1 Integers; 2 Congruences and Residue Class Rings; 3 Encryption; 4 Probability and Perfect Secrecy; 5 DES; 6 AES; 7 Prime Number Generation; 8 Public-Key Encryption; 9 Factoring; 10 Discrete Logarithms; 11 Cryptographic Hash Functions; 12 Digital Signatures; 13 Other Systems; 14 Identification; 15 Secret Sharing; 16 Public-Key Infrastructures; Solutions of the Exercises; References.
Trade Review
From the reviews: Zentralblatt Math: "[...] Of the three books under review, Buchmann's is by far the most sophisticated, complete and up-to-date. It was written for computer-science majors - German ones at that - and might be rough going for all but the best American undergraduates. It is amazing how much Buchmann is able to do in under 300 pages: self-contained explanations of the relevant mathematics (with proofs); a systematic introduction to symmetric cryptosystems, including a detailed description and discussion of DES; a good treatment of primality testing, integer factorization, and algorithms for discrete logarithms, clearly written sections describing most of the major types of cryptosystems, and explanations of basic concepts of practical cryptography such as hash functions, message authentication codes, signatures, passwords, certification authorities, and certificate chains. This book is an excellent reference, and I believe that it would also be a good textbook for a course for mathematics or computer science majors, provided that the instructor is prepared to supplement it with more leisurely treatments of some of the topics." N. Koblitz (Seattle, WA), American Math. Society Monthly.
J.A. Buchmann, Introduction to Cryptography: "It gives a clear and systematic introduction into the subject whose popularity is ever increasing, and can be recommended to all who would like to learn about cryptography. The book contains many exercises and examples. It can be used as a textbook and is likely to become popular among students. The necessary definitions and concepts from algebra, number theory and probability theory are formulated, illustrated by examples and applied to cryptography." (ZENTRALBLATT MATH)
"For those of us who wish to learn more about cryptography and/or to teach it, Johannes Buchmann has written this book. … The book is mathematically complete and a satisfying read. There are plenty of homework exercises … . This is a good book for upperclassmen, graduate students, and faculty. … This book makes a superior reference and a fine textbook." (Robert W. Vallin, MathDL, January, 2001)
"Buchmann’s book is a text on cryptography intended to be used at the undergraduate level. … the intended audiences of this book are ‘readers who want to learn about modern cryptographic algorithms and their mathematical foundations … . I enjoy reading this book. … Readers will find a good exposition of the techniques used in developing and analyzing these algorithms. … These make Buchmann’s text an excellent choice for self study or as a text for students … in elementary number theory and algebra." (Andrew C. Lee, SIGACT News, Vol. 34 (4), 2003)
From the reviews of the second edition: "This is the English translation of the second edition of the author’s prominent German textbook ‘Einführung in die Kryptographie’. The original text grew out of several courses on cryptography given by the author at the Technical University Darmstadt; it is aimed at readers who want to learn about modern cryptographic techniques and its mathematical foundations … . As compared with the first edition the number of exercises has almost been doubled and some material … has been added." (R. Steinbauer, Monatshefte für Mathematik, Vol. 150 (4), 2007)
Table of Contents: Integers; Congruences and Residue Class Rings; Encryption; Probability and Perfect Secrecy; DES; AES; Prime Number Generation; Public-Key Encryption; Factoring; Discrete Logarithms; Cryptographic Hash Functions; Digital Signatures; Other Systems; Identification; Public-Key Infrastructures; Solutions of the Odd Exercises; Subject Index; Bibliography.
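To illustrate the kind of number theory the book builds on (congruences and public-key encryption), here is a toy textbook-RSA round trip using Python's built-in modular arithmetic; the primes are deliberately tiny and the scheme is unpadded, so this is purely pedagogical and not an example from the book.

    # Toy textbook RSA with tiny primes (pedagogical only: unpadded and insecure)
    p, q = 61, 53
    n = p * q                    # public modulus
    phi = (p - 1) * (q - 1)      # Euler's totient of n
    e = 17                       # public exponent, coprime to phi
    d = pow(e, -1, phi)          # private exponent: modular inverse (Python 3.8+)

    m = 42                       # message encoded as an integer smaller than n
    c = pow(m, e, n)             # encryption: c = m^e mod n
    assert pow(c, d, n) == m     # decryption recovers m: m = c^d mod n
    print(c, pow(c, d, n))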
£53.99
Taylor & Francis Ltd Grammars and Automata for String Processing: From Mathematics and Computer Science to Biology and Back
Book Synopsis: The conventional wisdom was that biology influenced mathematics and computer science. But a new approach has taken hold: transferring methods and tools from computer science to biology. The reverse trend is evident in Grammars and Automata for String Processing: From Mathematics and Computer Science to Biology and Back. The contributors address the structural (syntactical) view of the domain. Mathematical linguistics and computer science can offer various tools for modeling complex macromolecules and for analyzing and simulating biological issues. This collection is valuable for students and researchers in biology, computer science, and applied mathematics.
Table of Contents: Logistics, Languages and Combinatorics; Models of Molecular Computing.
£118.75
John Wiley & Sons Inc Algorithms for Image Processing and Computer Vision
Book SynopsisProgrammers, scientists, and engineers are always in need of newer techniques and algorithms to manipulate and interpret images. Algorithms for Image Processing and Computer Vision is an accessible collection of algorithms for common image processing applications that simplifies complicated mathematical calculations.Table of ContentsPreface xxi Chapter 1 Practical Aspects of a Vision System — Image Display, Input/Output, and Library Calls 1 OpenCV 2 The Basic OpenCV Code 2 The IplImage Data Structure 3 Reading and Writing Images 6 Image Display 7 An Example 7 Image Capture 10 Interfacing with the AIPCV Library 14 Website Files 18 References 18 Chapter 2 Edge-Detection Techniques 21 The Purpose of Edge Detection 21 Traditional Approaches and Theory 23 Models of Edges 24 Noise 26 Derivative Operators 30 Template-Based Edge Detection 36 Edge Models: The Marr-Hildreth Edge Detector 39 The Canny Edge Detector 42 The Shen-Castan (ISEF) Edge Detector 48 A Comparison of Two Optimal Edge Detectors 51 Color Edges 53 Source Code for the Marr-Hildreth Edge Detector 58 Source Code for the Canny Edge Detector 62 Source Code for the Shen-Castan Edge Detector 70 Website Files 80 References 82 Chapter 3 Digital Morphology 85 Morphology Defined 85 Connectedness 86 Elements of Digital Morphology — Binary Operations 87 Binary Dilation 88 Implementing Binary Dilation 92 Binary Erosion 94 Implementation of Binary Erosion 100 Opening and Closing 101 MAX — A High-Level Programming Language for Morphology 107 The ‘‘Hit-and-Miss’’ Transform 113 Identifying Region Boundaries 116 Conditional Dilation 116 Counting Regions 119 Grey-Level Morphology 121 Opening and Closing 123 Smoothing 126 Gradient 128 Segmentation of Textures 129 Size Distribution of Objects 130 Color Morphology 131 Website Files 132 References 135 Chapter 4 Grey-Level Segmentation 137 Basics of Grey-Level Segmentation 137 Using Edge Pixels 139 Iterative Selection 140 The Method of Grey-Level Histograms 141 Using Entropy 142 Fuzzy Sets 146 Minimum Error Thresholding 148 Sample Results From Single Threshold Selection 149 The Use of Regional Thresholds 151 Chow and Kaneko 152 Modeling Illumination Using Edges 156 Implementation and Results 159 Comparisons 160 Relaxation Methods 161 Moving Averages 167 Cluster-Based Thresholds 170 Multiple Thresholds 171 Website Files 172 References 173 Chapter 5 Texture and Color 177 Texture and Segmentation 177 A Simple Analysis of Texture in Grey-Level Images 179 Grey-Level Co-Occurrence 182 Maximum Probability 185 Moments 185 Contrast 185 Homogeneity 185 Entropy 186 Results from the GLCM Descriptors 186 Speeding Up the Texture Operators 186 Edges and Texture 188 Energy and Texture 191 Surfaces and Texture 193 Vector Dispersion 193 Surface Curvature 195 Fractal Dimension 198 Color Segmentation 201 Color Textures 205 Website Files 205 References 206 Chapter 6 Thinning 209 What Is a Skeleton? 
209 The Medial Axis Transform 210 Iterative Morphological Methods 212 The Use of Contours 221 Choi/Lam/Siu Algorithm 224 Treating the Object as a Polygon 226 Triangulation Methods 227 Force-Based Thinning 228 Definitions 229 Use of a Force Field 230 Subpixel Skeletons 234 Source Code for Zhang-Suen/Stentiford/Holt Combined Algorithm 235 Website Files 246 References 247 Chapter 7 Image Restoration 251 Image Degradations — The Real World 251 The Frequency Domain 253 The Fourier Transform 254 The Fast Fourier Transform 256 The Inverse Fourier Transform 260 Two-Dimensional Fourier Transforms 260 Fourier Transforms in OpenCV 262 Creating Artificial Blur 264 The Inverse Filter 270 The Wiener Filter 271 Structured Noise 273 Motion Blur — A Special Case 276 The Homomorphic Filter — Illumination 277 Frequency Filters in General 278 Isolating Illumination Effects 280 Website Files 281 References 283 Chapter 8 Classification 285 Objects, Patterns, and Statistics 285 Features and Regions 288 Training and Testing 292 Variation: In-Class and Out-Class 295 Minimum Distance Classifiers 299 Distance Metrics 300 Distances Between Features 302 Cross Validation 304 Support Vector Machines 306 Multiple Classifiers — Ensembles 309 Merging Multiple Methods 309 Merging Type 1 Responses 310 Evaluation 311 Converting Between Response Types 312 Merging Type 2 Responses 313 Merging Type 3 Responses 315 Bagging and Boosting 315 Bagging 315 Boosting 316 Website Files 317 References 318 Chapter 9 Symbol Recognition 321 The Problem 321 OCR on Simple Perfect Images 322 OCR on Scanned Images — Segmentation 326 Noise 327 Isolating Individual Glyphs 329 Matching Templates 333 Statistical Recognition 337 OCR on Fax Images — Printed Characters 339 Orientation — Skew Detection 340 The Use of Edges 345 Handprinted Characters 348 Properties of the Character Outline 349 Convex Deficiencies 353 Vector Templates 357 Neural Nets 363 A Simple Neural Net 364 A Backpropagation Net for Digit Recognition 368 The Use of Multiple Classifiers 372 Merging Multiple Methods 372 Results From the Multiple Classifier 375 Printed Music Recognition — A Study 375 Staff Lines 376 Segmentation 378 Music Symbol Recognition 381 Source Code for Neural Net Recognition System 383 Website Files 390 References 392 Chapter 10 Content-Based Search — Finding Images by Example 395 Searching Images 395 Maintaining Collections of Images 396 Features for Query by Example 399 Color Image Features 399 Mean Color 400 Color Quad Tree 400 Hue and Intensity Histograms 401 Comparing Histograms 402 Requantization 403 Results from Simple Color Features 404 Other Color-Based Methods 407 Grey-Level Image Features 408 Grey Histograms 409 Grey Sigma — Moments 409 Edge Density — Boundaries Between Objects 409 Edge Direction 410 Boolean Edge Density 410 Spatial Considerations 411 Overall Regions 411 Rectangular Regions 412 Angular Regions 412 Circular Regions 414 Hybrid Regions 414 Test of Spatial Sampling 414 Additional Considerations 417 Texture 418 Objects, Contours, Boundaries 418 Data Sets 418 Website Files 419 References 420 Systems 424 Chapter 11 High-Performance Computing for Vision and Image Processing 425 Paradigms for Multiple-Processor Computation 426 Shared Memory 426 Message Passing 427 Execution Timing 427 Using clock() 428 Using QueryPerformanceCounter 430 The Message-Passing Interface System 432 Installing MPI 432 Using MPI 433 Inter-Process Communication 434 Running MPI Programs 436 Real Image Computations 437 Using a Computer Network — Cluster Computing 440 A 
Shared Memory System — Using the PC Graphics Processor 444 GLSL 444 OpenGL Fundamentals 445 Practical Textures in OpenGL 448 Shader Programming Basics 451 Vertex and Fragment Shaders 452 Required GLSL Initializations 453 Reading and Converting the Image 454 Passing Parameters to Shader Programs 456 Putting It All Together 457 Speedup Using the GPU 459 Developing and Testing Shader Code 459 Finding the Needed Software 460 Website Files 461 References 461 Index 465
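The Chapter 3 material on binary morphology (dilation, erosion, opening and closing) is easy to illustrate outside the book's own AIPCV/OpenCV setting. Below is a minimal NumPy sketch of binary dilation with a cross-shaped structuring element; the function name, the structuring element, and the brute-force loop are illustrative choices made here, not the book's implementation.

import numpy as np

def binary_dilate(image, selem):
    """Naive binary dilation: a pixel is set if the structuring element,
    centred on it, overlaps any foreground pixel of the input."""
    img = np.asarray(image) > 0
    se = np.asarray(selem) > 0
    rows, cols = img.shape
    r_off, c_off = se.shape[0] // 2, se.shape[1] // 2
    padded = np.pad(img, ((r_off, r_off), (c_off, c_off)))
    out = np.zeros_like(img)
    for r in range(rows):
        for c in range(cols):
            window = padded[r:r + se.shape[0], c:c + se.shape[1]]
            out[r, c] = np.any(window & se)
    return out.astype(np.uint8)

if __name__ == "__main__":
    image = np.zeros((7, 7), dtype=np.uint8)
    image[3, 3] = 1                      # single foreground pixel
    cross = np.array([[0, 1, 0],
                      [1, 1, 1],
                      [0, 1, 0]], dtype=np.uint8)
    print(binary_dilate(image, cross))   # the pixel grows into a cross shape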
£71.10
John Wiley & Sons Inc Algorithmic Problem Solving
Book Synopsis* Novel approach to the mathematics of problem solving, in particular how to do logical calculations. * Many of the problems are well-known from (mathematical) puzzle books. * The solution method in the book is new and more relevant to the true nature of problem solving in the modern IT-dominated world.Table of ContentsPreface xi PART I Algorithmic Problem Solving 1 CHAPTER 1 – Introduction 3 1.1 Algorithms 3 1.2 Algorithmic Problem Solving 4 1.3 Overview 5 1.4 Bibliographic Remarks 6 CHAPTER 2 – Invariants 7 2.1 Chocolate Bars 10 2.1.1 The Solution 10 2.1.2 The Mathematical Solution 11 2.2 Empty Boxes 16 2.2.1 Review 19 2.3 The Tumbler Problem 22 2.3.1 Non-deterministic Choice 23 2.4 Tetrominoes 24 2.5 Summary 30 2.6 Bibliographic Remarks 34 CHAPTER 3 – Crossing a River 35 3.1 Problems 36 3.2 Brute Force 37 3.2.1 Goat, Cabbage and Wolf 37 3.2.2 State-Space Explosion 39 3.2.3 Abstraction 41 3.3 Nervous Couples 42 3.3.1 What Is the Problem? 42 3.3.2 Problem Structure 43 3.3.3 Denoting States and Transitions 44 3.3.4 Problem Decomposition 45 3.3.5 A Review 48 3.4 Rule of Sequential Composition 50 3.5 The Bridge Problem 54 3.6 Conditional Statements 63 3.7 Summary 65 3.8 Bibliographic Remarks 65 CHAPTER 4 – Games 67 4.1 Matchstick Games 67 4.2 Winning Strategies 69 4.2.1 Assumptions 69 4.2.2 Labelling Positions 70 4.2.3 Formulating Requirements 72 4.3 Subtraction-Set Games 74 4.4 Sums of Games 78 4.4.1 A Simple Sum Game 79 4.4.2 Maintain Symmetry! 81 4.4.3 More Simple Sums 82 4.4.4 Evaluating Positions 83 4.4.5 Using the Mex Function 87 4.5 Summary 91 4.6 Bibliographic Remarks 92 CHAPTER 5 – Knights and Knaves 95 5.1 Logic Puzzles 95 5.2 Calculational Logic 96 5.2.1 Propositions 96 5.2.2 Knights and Knaves 97 5.2.3 Boolean Equality 98 5.2.4 Hidden Treasures 100 5.2.5 Equals for Equals 101 5.3 Equivalence and Continued Equalities 102 5.3.1 Examples of the Associativity of Equivalence 104 5.3.2 On Natural Language 105 5.4 Negation 106 5.4.1 Contraposition 109 5.4.2 Handshake Problems 112 5.4.3 Inequivalence 113 5.5 Summary 117 5.6 Bibliographic Remarks 117 CHAPTER 6 – Induction 119 6.1 Example Problems 120 6.2 Cutting the Plane 123 6.3 Triominoes 126 6.4 Looking for Patterns 128 6.5 The Need for Proof 129 6.6 From Verification to Construction 130 6.7 Summary 134 6.8 Bibliographic Remarks 134 CHAPTER 7 – Fake-Coin Detection 137 7.1 Problem Formulation 137 7.2 Problem Solution 139 7.2.1 The Basis 139 7.2.2 Induction Step 139 7.2.3 The Marked-Coin Problem 140 7.2.4 The Complete Solution 141 7.3 Summary 146 7.4 Bibliographic Remarks 146 CHAPTER 8 – The Tower of Hanoi 147 8.1 Specification and Solution 147 8.1.1 The End of the World! 147 8.1.2 Iterative Solution 148 8.1.3 Why? 
149 8.2 Inductive Solution 149 8.3 The Iterative Solution 153 8.4 Summary 156 8.5 Bibliographic Remarks 156 CHAPTER 9 – Principles of Algorithm Design 157 9.1 Iteration, Invariants and Making Progress 158 9.2 A Simple Sorting Problem 160 9.3 Binary Search 163 9.4 Sam Loyd’s Chicken-Chasing Problem 166 9.4.1 Cornering the Prey 170 9.4.2 Catching the Prey 174 9.4.3 Optimality 176 9.5 Projects 177 9.6 Summary 178 9.7 Bibliographic Remarks 180 CHAPTER 10 – The Bridge Problem 183 10.1 Lower and Upper Bounds 183 10.2 Outline Strategy 185 10.3 Regular Sequences 187 10.4 Sequencing Forward Trips 189 10.5 Choosing Settlers and Nomads 193 10.6 The Algorithm 196 10.7 Summary 199 10.8 Bibliographic Remarks 200 CHAPTER 11 – Knight’s Circuit 201 11.1 Straight-Move Circuits 202 11.2 Supersquares 206 11.3 Partitioning the Board 209 11.4 Summary 216 11.5 Bibliographic Remarks 218 PART II Mathematical Techniques 219 CHAPTER 12 – The Language of Mathematics 221 12.1 Variables, Expressions and Laws 222 12.2 Sets 224 12.2.1 The Membership Relation 224 12.2.2 The Empty Set 224 12.2.3 Types/Universes 224 12.2.4 Union and Intersection 225 12.2.5 Set Comprehension 225 12.2.6 Bags 227 12.3 Functions 227 12.3.1 Function Application 228 12.3.2 Binary Operators 230 12.3.3 Operator Precedence 230 12.4 Types and Type Checking 232 12.4.1 Cartesian Product and Disjoint Sum 233 12.4.2 Function Types 235 12.5 Algebraic Properties 236 12.5.1 Symmetry 237 12.5.2 Zero and Unit 238 12.5.3 Idempotence 239 12.5.4 Associativity 240 12.5.5 Distributivity/Factorisation 241 12.5.6 Algebras 243 12.6 Boolean Operators 244 12.7 Binary Relations 246 12.7.1 Reflexivity 247 12.7.2 Symmetry 248 12.7.3 Converse 249 12.7.4 Transitivity 249 12.7.5 Anti-symmetry 251 12.7.6 Orderings 252 12.7.7 Equality 255 12.7.8 Equivalence Relations 256 12.8 Calculations 257 12.8.1 Steps in a Calculation 259 12.8.2 Relations between Steps 260 12.8.3 ‘‘If’’ and ‘‘Only If’’ 262 12.9 Exercises 264 CHAPTER 13 – Boolean Algebra 267 13.1 Boolean Equality 267 13.2 Negation 269 13.3 Disjunction 270 13.4 Conjunction 271 13.5 Implication 274 13.5.1 Definitions and Basic Properties 275 13.5.2 Replacement Rules 276 13.6 Set Calculus 279 13.7 Exercises 281 CHAPTER 14 – Quantifiers 285 14.1 DotDotDot and Sigmas 285 14.2 Introducing Quantifier Notation 286 14.2.1 Summation 287 14.2.2 Free and Bound Variables 289 14.2.3 Properties of Summation 291 14.2.4 Warning 297 14.3 Universal and Existential Quantification 297 14.3.1 Universal Quantification 298 14.3.2 Existential Quantification 300 14.4 Quantifier Rules 301 14.4.1 The Notation 302 14.4.2 Free and Bound Variables 303 14.4.3 Dummies 303 14.4.4 Range Part 303 14.4.5 Trading 304 14.4.6 Term Part 304 14.4.7 Distributivity Properties 304 14.5 Exercises 306 CHAPTER 15 – Elements of Number Theory 309 15.1 Inequalities 309 15.2 Minimum and Maximum 312 15.3 The Divides Relation 315 15.4 Modular Arithmetic 316 15.4.1 Integer Division 316 15.4.2 Remainders and Modulo Arithmetic 320 15.5 Exercises 322 CHAPTER 16 – Relations, Graphs and Path Algebras 325 16.1 Paths in a Directed Graph 325 16.2 Graphs and Relations 328 16.2.1 Relation Composition 330 16.2.2 Union of Relations 332 16.2.3 Transitive Closure 334 16.2.4 Reflexive Transitive Closure 338 16.3 Functional and Total Relations 339 16.4 Path-Finding Problems 341 16.4.1 Counting Paths 341 16.4.2 Frequencies 343 16.4.3 Shortest Distances 344 16.4.4 All Paths 345 16.4.5 Semirings and Operations on Graphs 347 16.5 Matrices 351 16.6 Closure Operators 353 16.7 Acyclic Graphs 354 
16.7.1 Topological Ordering 355 16.8 Combinatorics 357 16.8.1 Basic Laws 358 16.8.2 Counting Choices 359 16.8.3 Counting Paths 361 16.9 Exercises 366 Solutions to Exercises 369 References 405 Index 407
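Chapter 8 treats the Tower of Hanoi as a case study in specification and inductive solution. As a taste of the inductive argument (move n-1 discs aside, move the largest disc, then move the n-1 discs back on top), here is a minimal recursive sketch; the peg names and the move-list representation are illustrative, not the book's notation.

def hanoi(n, source, target, spare, moves=None):
    """Move n discs from source to target using spare as the auxiliary peg."""
    if moves is None:
        moves = []
    if n > 0:
        hanoi(n - 1, source, spare, target, moves)   # clear the top n-1 discs
        moves.append((source, target))               # move the largest disc
        hanoi(n - 1, spare, target, source, moves)   # put the n-1 discs back on top
    return moves

if __name__ == "__main__":
    moves = hanoi(3, "A", "C", "B")
    print(len(moves), "moves:", moves)               # 2**3 - 1 = 7 moves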
£39.56
John Wiley & Sons Inc Data Structures and Algorithms with
Book SynopsisAn object-oriented learning framework for creating good software design. Bruno Preiss presents readers with a modern, object-oriented perspective for looking at data structures and algorithms, clearly showing how to use polymorphism and inheritance, and including fragments from working and tested programs.Table of ContentsAlgorithm Analysis. Asymptotic Notation. Foundational Data Structures. Data Types and Abstraction. Stacks, Queues and Deques. Ordered Lists and Sorted Lists. Hashing, Hash Tables and Scatter Tables. Trees. Search Trees. Heaps and Priority Queues. Sets, Multisets and Partitions. Dynamic Storage Allocation. Algorithmic Patterns and Problem Solvers. Sorting Algorithms and Sorters. Graphs and Graph Algorithms. Appendices. Index.
£180.86
University of California Press The Feel of Algorithms
Table of ContentsPreface Acknowledgments Introduction 1 Structures of Feeling in Algorithmic Culture 2 Coevolving with Algorithms 3 The Digital Geography of Fear 4 Friction in Algorithmic Relations 5 Care for Algorithmic Futures Ways Forward References Index
£21.25
Harvard University Press A History of Data Visualization and Graphic Communication
Book SynopsisStatistical graphing was born in the seventeenth century as a scientific tool, but it quickly escaped all disciplinary bounds. Today graphics are ubiquitous in daily life. Michael Friendly and Howard Wainer detail the history of data visualization and argue that it has not only helped us solve problems, but it has also changed the way we think.Trade ReviewThe invention of graphs and charts was a much quieter affair than that of the telescope, but these tools have done just as much to change how and what we see. -- Hannah Fry * New Yorker *An indispensable account of how important practitioners of data visualizations write the history of their field. -- Crystal Lee * Information & Culture *We live in an era of data dependence—never before have graphic representations of data been as essential and sought after as at this moment…There has not been a publication of this scope on the evolution of graphic representation of qualitative and quantitative data since Funkhouser’s work…Scholars, practitioners, lovers of statistics and data visualization, and anyone interested in understanding the methods and techniques of today will benefit from understanding the innovations that brought us to where we are. -- María del Mar Navarro * Journal of Design, Economics, and Innovation *A thoughtful and well-written introduction to the world of data visualization and its history. -- Bill Satzer * MAA Reviews *An intellectually fascinating book…The audience for this book is wide. It would be useful to professionals and to professors in many departments such as psychology, sociology, economics, biology, physics, and any department that uses graphs to display quantitative information. It is a book to broaden your knowledge and offer interesting asides for lectures and meetings…Consult it frequently to learn of the stories of the developers of the many graphic methods we use today. -- Malcolm James Ree * Personnel Psychology *A marvel of research scholarship…This is the sort of book that one can browse and sample in bite-size chunks as the mood seizes, encountering curious delights while doing so. -- Bert Gunter * Significance *A masterly study of graphic innovations, their context, and their scientific use. This brilliant book, without equivalent, is an indispensable read. -- Gilles Palsky, coauthor of An Atlas of Geographical WondersFriendly and Wainer are the Watson and Crick of statistical graphics, showing us the history of the DNA structure that is the code of life for innovative visualizations. -- Ben Shneiderman, founder of the Human-Computer Interaction Lab, University of MarylandData expertise is a fundamental prerequisite for success in our digital age. But exactly how, and when, have we learned to draw conclusions from data? For decades, Michael Friendly and Howard Wainer have been studying how data has informed decision-making, through visualization and statistical analysis. Replete with mesmerizing visual examples, this book is an eye-opening distillation of their research. -- Sandra Rendgen, author of History of Information GraphicsMichael Friendly and Howard Wainer have given us a wonderful history of the dazzling field of data visualization. They bring new life to ancient death statistics and describe the artistic poetry used to display numbers. An intriguing story of how we have learned to communicate data of all types. -- Stephen M. 
Stigler, author of The Seven Pillars of Statistical WisdomTwo of the most distinguished scholars of data visualization give us a glimpse of ancient attempts to quantify the world, before revealing the century-long revolution that led to the invention of modern statistics and many of the graphical methods we use today. I learned a lot from this book, and I think you will too. -- Alberto Cairo, author of How Charts Lie: Getting Smarter about Visual InformationFriendly and Wainer demonstrate the amazing progress that has been made in data graphics over the past two hundred years. Understanding this history—where graphs came from and how they developed—will be valuable as we move forward. -- Andrew Gelman, coauthor of Regression and Other Stories
£28.86
Princeton University Press Google's PageRank and Beyond The Science of Search Engine Rankings
Book SynopsisWhy doesn't your home page appear on the first page of search results, even when you query your own name? How do other web pages always appear at the top? What creates these powerful rankings? And how? This book about the science of web page rankings supplies the answers to these and other questions.Trade ReviewHonorable Mention for the 2006 Award for Best Professional/Scholarly Book in Computer & Information Science, Association of American Publishers "[F]or anyone who wants to delve deeply into just how Google's PageRank works, I recommend Google's PageRank and Beyond."--Stephen H. Wildstrom, BusinessWeek "This is a worthwhile book. It offers a comprehensive and erudite presentation of PageRank and related search-engine algorithms, and it is written in an approachable way, given the mathematical foundations involved."--Jonathan Bowen, Times Higher Education Supplement "This book should be at the top of anyone's list as a must-read for those interested in how search engines work and, more specifically how Google is to meet the needs of so many people in so many ways."--Michael W. Berry, SIAM Review "Amy N. Langville and Carl D. Meyer examine the logic, mathematics, and sophistication behind Google's PageRank and other Internet search engine ranking programs... It is an excellent work."--Ian D. Gordon, Library Journal "If I were taking, or teaching, a course in linear algebra today, this book would be a godsend."--Ed Gerstner, Nature Physics "Langville and Meyer present the mathematics in all its detail... But they vary the math with discussions of the many issues involved in building search engines, the 'wars' between search engine developers and those trying to artificially inflate the position of their pages, and the future of search-engine development... Google's PageRank and Beyond makes good reading for anyone, student or professional, who wants to understand the details of search engines."--James Hendler, Physics Today "This book is written for people who are curious about new science and technology as well as for those with more advanced background in matrix theory... Much of the book can be easily followed by general readers, while understanding the remaining part requires only a good first course in linear algebra. It can be a reference book for people who want to know more about the ideas behind the currently popular search engines, and it provides an introductory text for beginning researchers in the area of information retrieval."--Jiu Ding, Mathemathical Reviews "The book is very attractively and clearly written. The authors succeed to manage in an optimal way the presentation of both basic and more sophisticated concepts involved in the analysis of Google's PageRank, such that the book serves both audiences: the general and the technical scientific public."--Constantin Popa, Zentralblatt MATH "The book under review is excellently written, with a fresh and engaging style. The reader will particularly enjoy the 'Asides' interspersed throughout the text. They contain all kind of entertaining stories, practical tips, and amusing quotes... The book also contains some useful resources for computation."--Pablo Fernandez, Mathematical Intelligencer "Google's PageRank and Beyond describes the link analysis tool called PageRank, puts it in the context of web search engines and information retrieval, and describes competing methods for ranking webpages. 
It is an utterly engaging book."--Bill Satzer, MathDL.maa.orgTable of ContentsPreface ix Chapter 1: Introduction to Web Search Engines 1 1.1 A Short History of Information Retrieval 1 1.2 An Overview of Traditional Information Retrieval 5 1.3 Web Information Retrieval 9 Chapter 2: Crawling, Indexing, and Query Processing 15 2.1 Crawling 15 2.2 The Content Index 19 2.3 Query Processing 21 Chapter 3: Ranking Webpages by Popularity 25 3.1 The Scene in 1998 25 3.2 Two Theses 26 3.3 Query-Independence 30 Chapter 4: The Mathematics of Google's PageRank 31 4.1 The Original Summation Formula for PageRank 32 4.2 Matrix Representation of the Summation Equations 33 4.3 Problems with the Iterative Process 34 4.4 A Little Markov Chain Theory 36 4.5 Early Adjustments to the Basic Model 36 4.6 Computation of the PageRank Vector 39 4.7 Theorem and Proof for Spectrum of the Google Matrix 45 Chapter 5: Parameters in the PageRank Model 47 5.1 The alpha Factor 47 5.2 The Hyperlink Matrix H 48 5.3 The Teleportation Matrix E 49 Chapter 6: The Sensitivity of PageRank 57 6.1 Sensitivity with respect to alpha 57 6.2 Sensitivity with respect to H 62 6.3 Sensitivity with respect to vT 63 6.4 Other Analyses of Sensitivity 63 6.5 Sensitivity Theorems and Proofs 66 Chapter 7: The PageRank Problem as a Linear System 71 7.1 Properties of (I -- alphaS) 71 7.2 Properties of (I -- alphaH) 72 7.3 Proof of the PageRank Sparse Linear System 73 Chapter 8: Issues in Large-Scale Implementation of PageRank 75 8.1 Storage Issues 75 8.2 Convergence Criterion 79 8.3 Accuracy 79 8.4 Dangling Nodes 80 8.5 Back Button Modeling 84 Chapter 9: Accelerating the Computation of PageRank 89 9.1 An Adaptive Power Method 89 9.2 Extrapolation 90 9.3 Aggregation 94 9.4 Other Numerical Methods 97 Chapter 10: Updating the PageRank Vector 99 10.1 The Two Updating Problems and their History 100 10.2 Restarting the Power Method 101 10.3 Approximate Updating Using Approximate Aggregation 102 10.4 Exact Aggregation 104 10.5 Exact vs. Approximate Aggregation 105 10.6 Updating with Iterative Aggregation 107 10.7 Determining the Partition 109 10.8 Conclusions 111 Chapter 11: The HITS Method for Ranking Webpages 115 11.1 The HITS Algorithm 115 11.2 HITS Implementation 117 11.3 HITS Convergence 119 11.4 HITS Example 120 11.5 Strengths and Weaknesses of HITS 122 11.6 HITS's Relationship to Bibliometrics 123 11.7 Query-Independent HITS 124 11.8 Accelerating HITS 126 11.9 HITS Sensitivity 126 Chapter 12: Other Link Methods for Ranking Webpages 131 12.1 SALSA 131 12.2 Hybrid Ranking Methods 135 12.3 Rankings based on Traffic Flow 136 Chapter 13: The Future of Web Information Retrieval 139 13.1 Spam 139 13.2 Personalization 142 13.3 Clustering 142 13.4 Intelligent Agents 143 13.5 Trends and Time-Sensitive Search 144 13.6 Privacy and Censorship 146 13.7 Library Classification Schemes 147 13.8 Data Fusion 148 Chapter 14: Resources for Web Information Retrieval 149 14.1 Resources for Getting Started 149 14.2 Resources for Serious Study 150 Chapter 15: The Mathematics Guide 153 15.1 Linear Algebra 153 15.2 Perron-Frobenius Theory 167 15.3 Markov Chains 175 15.4 Perron Complementation 186 15.5 Stochastic Complementation 192 15.6 Censoring 194 15.7 Aggregation 195 15.8 Disaggregation 198 Chapter 16: Glossary 201 Bibliography 207 Index 219
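Chapter 4's summation formula and the power-method computation of the PageRank vector (Section 4.6) can be sketched in a few lines. The toy link graph, the damping factor of 0.85, and the uniform treatment of dangling nodes below are illustrative defaults chosen here, not the authors' code or notation.

import numpy as np

def pagerank(links, alpha=0.85, tol=1e-10):
    """Power iteration for PageRank on a dict {page: [pages it links to]}.
    Dangling pages (no out-links) spread their rank uniformly."""
    pages = sorted(links)
    n = len(pages)
    index = {p: i for i, p in enumerate(pages)}
    # Column-stochastic hyperlink matrix: H[j, i] = 1/outdeg(i) if i links to j.
    H = np.zeros((n, n))
    for p, outs in links.items():
        if outs:
            for q in outs:
                H[index[q], index[p]] = 1.0 / len(outs)
        else:
            H[:, index[p]] = 1.0 / n
    r = np.full(n, 1.0 / n)
    while True:
        r_next = alpha * H @ r + (1 - alpha) / n   # damped power-method step
        if np.abs(r_next - r).sum() < tol:
            return dict(zip(pages, r_next))
        r = r_next

if __name__ == "__main__":
    toy_web = {"A": ["B", "C"], "B": ["C"], "C": ["A"], "D": []}
    ranks = pagerank(toy_web)
    print({p: round(v, 3) for p, v in ranks.items()})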
£25.20
Princeton University Press The Inglorious Years
Book SynopsisHow populism is fueled by the demise of the industrial order and the emergence of a new digital society ruled by algorithmsIn the revolutionary excitement of the 1960s, young people around the world called for a radical shift away from the old industrial order, imagining a future of technological liberation and unfettered prosperity. IndustrialTrade Review"A welcome addition to the growing literature on the digital economy and change." * Choice *"Stimulating." * Paradigm Explorer *
£14.39
Pluto Press Data Power
Book SynopsisAn introduction to learning how to protect ourselves and organise against Big DataTrade Review'A call to arms [...] sets out a clear, persuasive argument for the need to challenge the power of platforms and systems, and details the tools to do so. A thought-provoking read' -- Prof. Rob Kitchin, Maynooth University‘The first non-technical guidebook on how to live with location data and it is a truly radical response for our times. Spatial data for us, not about us’ -- Jeremy W. Crampton, Professor of Urban Data Analysis, Newcastle University‘Brilliantly traces the closed loops of spatial data and suggests new escape routes, reminding us that our data can be remade to tell different stories’ -- Professor Kate Crawford, author of ‘Atlas of AI: Power, Politics, and the Planetary Costs of Artificial Intelligence’'The book that I’ve long been waiting for, one that takes a material approach to the data geographies informing and being informed by technologies of everyday life’ -- Erin McElroy, Assistant Professor of American and Digital Studies at the University of Texas at Austin and cofounder of the Anti-Eviction Mapping Project'Data Power is an activist handbook wrapped in a theoretical treatise inside a media manifesto. The authors have a lively set of suggestions that provide a welcome antidote to the temptations of resignation and complacency' -- Mark Andrejevic, Professor in the School of Media, Film, and Journalism at Monash UniversityTable of ContentsList of Figures and Tables Series Preface Acknowledgments List of Abbreviations Introduction: Technology and the Axes of Hope and Fear 1. Life in the Age of Big Data 2. What Are Our Data, and What Are They Worth? 3. Existing Everyday Resistances 4. Contesting the Data Spectacle 5. Our Data Are Us, So Make Them Ours Epilogue Notes Bibliography Index
£17.99
Pluto Press Data Power Radical Geographies of Control and Resistance
Book SynopsisAn introduction to learning how to protect ourselves and organise against Big DataTrade Review'A call to arms [...] sets out a clear, persuasive argument for the need to challenge the power of platforms and systems, and details the tools to do so. A thought-provoking read' -- Prof. Rob Kitchin, Maynooth University‘The first non-technical guidebook on how to live with location data and it is a truly radical response for our times. Spatial data for us, not about us’ -- Jeremy W. Crampton, Professor of Urban Data Analysis, Newcastle University‘Brilliantly traces the closed loops of spatial data and suggests new escape routes, reminding us that our data can be remade to tell different stories’ -- Professor Kate Crawford, author of ‘Atlas of AI: Power, Politics, and the Planetary Costs of Artificial Intelligence’'The book that I’ve long been waiting for, one that takes a material approach to the data geographies informing and being informed by technologies of everyday life’ -- Erin McElroy, Assistant Professor of American and Digital Studies at the University of Texas at Austin and cofounder of the Anti-Eviction Mapping Project'Data Power is an activist handbook wrapped in a theoretical treatise inside a media manifesto. The authors have a lively set of suggestions that provide a welcome antidote to the temptations of resignation and complacency' -- Mark Andrejevic, Professor in the School of Media, Film, and Journalism at Monash UniversityTable of ContentsList of Figures and Tables Series Preface Acknowledgments List of Abbreviations Introduction: Technology and the Axes of Hope and Fear 1. Life in the Age of Big Data 2. What Are Our Data, and What Are They Worth? 3. Existing Everyday Resistances 4. Contesting the Data Spectacle 5. Our Data Are Us, So Make Them Ours Epilogue Notes Bibliography Index
£68.00
Taylor & Francis Inc Multidimensional Discrete Unitary Transforms
Book SynopsisThis reference presents a more efficient, flexible, and manageable approach to unitary transform calculation and examines novel concepts in the design, classification, and management of fast algorithms for different transforms in one-, two-, and multidimensional cases. Illustrating methods to construct new unitary transforms for best algorithm selection and development in real-world applications, the book contains a wide range of examples to compare the efficacy of different algorithms in a variety of one-, two-, and three-dimensional cases. Multidimensional Discrete Unitary Transforms builds progressively from simple representative cases to higher levels of generalization.Table of ContentsSeries Introduction, Preface, 1: Basic Concepts and Notation, I: Tensor Representation of Multidimensional Signals, II: Analysis and effective computing procedures, III: Applications of Paired Transformations, Index
£256.50
John Wiley & Sons Inc Data Mining Algorithms
Book SynopsisData Mining Algorithms is a practical, technically-oriented guide to data mining algorithms that covers the most important algorithms for building classification, regression, and clustering models, as well as techniques used for attribute selection and transformation, model quality evaluation, and creating model ensembles. The author presents many of the important topics and methodologies widely used in data mining, whilst demonstrating the internal operation and usage of data mining algorithms using examples in R.Table of ContentsAcknowledgements xix Preface xxi References xxxi Part I Preliminaries 1 1 Tasks 3 1.1 Introduction 3 1.2 Inductive learning tasks 5 1.3 Classification 9 1.4 Regression 14 1.5 Clustering 16 1.6 Practical issues 19 1.7 Conclusion 20 1.8 Further readings 21 References 22 2 Basic statistics 23 2.1 Introduction 23 2.2 Notational conventions 24 2.3 Basic statistics as modeling 24 2.4 Distribution description 25 2.5 Relationship detection 47 2.6 Visualization 62 2.7 Conclusion 65 2.8 Further readings 66 References 67 Part II Classification 69 3 Decision trees 71 3.1 Introduction 71 3.2 Decision tree model 72 3.3 Growing 76 3.4 Pruning 90 3.5 Prediction 103 3.6 Weighted instances 105 3.7 Missing value handling 106 3.8 Conclusion 114 3.9 Further readings 114 References 116 4 Naïve Bayes classifier 118 4.1 Introduction 118 4.2 Bayes rule 118 4.3 Classification by Bayesian inference 120 4.4 Practical issues 125 4.5 Conclusion 131 4.6 Further readings 131 References 132 5 Linear classification 134 5.1 Introduction 134 5.2 Linear representation 136 5.3 Parameter estimation 145 5.4 Discrete attributes 154 5.5 Conclusion 155 5.6 Further readings 156 References 157 6 Misclassification costs 159 6.1 Introduction 159 6.2 Cost representation 161 6.3 Incorporating misclassification costs 164 6.4 Effects of cost incorporation 176 6.5 Experimental procedure 180 6.6 Conclusion 184 6.7 Further readings 185 References 187 7 Classification model evaluation 189 7.1 Introduction 189 7.2 Performance measures 190 7.3 Evaluation procedures 213 7.4 Conclusion 231 7.5 Further readings 232 References 233 Part III Regression 235 8 Linear regression 237 8.1 Introduction 237 8.2 Linear representation 238 8.3 Parameter estimation 242 8.4 Discrete attributes 250 8.5 Advantages of linear models 251 8.6 Beyond linearity 252 8.7 Conclusion 258 8.8 Further readings 258 References 259 9 Regression trees 261 9.1 Introduction 261 9.2 Regression tree model 262 9.3 Growing 263 9.4 Pruning 274 9.5 Prediction 277 9.6 Weighted instances 278 9.7 Missing value handling 279 9.8 Piecewise linear regression 284 9.9 Conclusion 292 9.10 Further readings 292 References 293 10 Regression model evaluation 295 10.1 Introduction 295 10.2 Performance measures 296 10.3 Evaluation procedures 303 10.4 Conclusion 309 10.5 Further readings 309 References 310 Part IV Clustering 311 11 (Dis)similarity measures 313 11.1 Introduction 313 11.2 Measuring dissimilarity and similarity 313 11.3 Difference-based dissimilarity 314 11.4 Correlation-based similarity 321 11.5 Missing attribute values 324 11.6 Conclusion 325 11.7 Further readings 325 References 326 12 k-Centers clustering 328 12.1 Introduction 328 12.2 Algorithm scheme 330 12.3 k-Means 334 12.4 Beyond means 338 12.5 Beyond (fixed) k 342 12.6 Explicit cluster modeling 343 12.7 Conclusion 345 12.8 Further readings 345 References 347 13 Hierarchical clustering 349 13.1 Introduction 349 13.2 Cluster hierarchies 351 13.3 Agglomerative clustering 353 13.4 Divisive 
clustering 361 13.5 Hierarchical clustering visualization 364 13.6 Hierarchical clustering prediction 366 13.7 Conclusion 369 13.8 Further readings 370 References 371 14 Clustering model evaluation 373 14.1 Introduction 373 14.2 Per-cluster quality measures 376 14.3 Overall quality measures 385 14.4 External quality measures 393 14.5 Using quality measures 397 14.6 Conclusion 398 14.7 Further readings 398 References 399 Part V Getting Better Models 401 15 Model ensembles 403 15.1 Introduction 403 15.2 Model committees 404 15.3 Base models 406 15.4 Model aggregation 420 15.5 Specific ensemble modeling algorithms 431 15.6 Quality of ensemble predictions 448 15.7 Conclusion 449 15.8 Further readings 450 References 451 16 Kernel methods 454 16.1 Introduction 454 16.2 Support vector machines 457 16.3 Support vector regression 473 16.4 Kernel trick 482 16.5 Kernel functions 484 16.6 Kernel prediction 487 16.7 Kernel-based algorithms 489 16.8 Conclusion 494 16.9 Further readings 495 References 496 17 Attribute transformation 498 17.1 Introduction 498 17.2 Attribute transformation task 499 17.3 Simple transformations 504 17.4 Multiclass encoding 510 17.5 Conclusion 521 17.6 Further readings 521 References 522 18 Discretization 524 18.1 Introduction 524 18.2 Discretization task 525 18.3 Unsupervised discretization 530 18.4 Supervised discretization 533 18.5 Effects of discretization 551 18.6 Conclusion 553 18.7 Further readings 553 References 556 19 Attribute selection 558 19.1 Introduction 558 19.2 Attribute selection task 559 19.3 Attribute subset search 562 19.4 Attribute selection filters 568 19.5 Attribute selection wrappers 588 19.6 Effects of attribute selection 593 19.7 Conclusion 598 19.8 Further readings 599 References 600 20 Case studies 602 20.1 Introduction 602 20.2 Census income 605 20.3 Communities and crime 631 20.4 Cover type 640 20.5 Conclusion 654 20.6 Further readings 655 References 655 Closing 657 A Notation 659 A.1 Attribute values 659 A.2 Data subsets 659 A.3 Probabilities 660 B R packages 661 B.1 CRAN packages 661 B.2 DMR packages 662 B.3 Installing packages 663 References 664 C Datasets 666 Index 667
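The book's clustering part works through k-centers and k-means (Chapter 12) with examples in R; as a language-neutral illustration of the same Lloyd-style iteration, here is a short NumPy sketch. The initialisation scheme, stopping rule, and toy data are arbitrary choices made for brevity, not taken from the book.

import numpy as np

def kmeans(X, k, n_iter=100, seed=0):
    """Lloyd's algorithm: alternate assigning points to the nearest centre
    and recomputing each centre as the mean of its assigned points."""
    rng = np.random.default_rng(seed)
    centres = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iter):
        dists = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2)
        labels = dists.argmin(axis=1)                 # assignment step
        new_centres = np.array([                      # update step
            X[labels == j].mean(axis=0) if np.any(labels == j) else centres[j]
            for j in range(k)
        ])
        if np.allclose(new_centres, centres):
            break
        centres = new_centres
    return labels, centres

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(3, 0.3, (50, 2))])
    labels, centres = kmeans(X, k=2)
    print("centres:\n", centres.round(2))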
£56.66
John Wiley & Sons Inc Evolutionary Algorithms for Mobile Ad Hoc Networks
Book SynopsisThis comprehensive guide describes how evolutionary algorithms (EA) may be used to identify, model, and optimize day-to-day problems that arise for researchers in optimization and mobile networking.Table of ContentsPreface xiii PART I BASIC CONCEPTS AND LITERATURE REVIEW 1 1 INTRODUCTION TO MOBILE AD HOC NETWORKS 3 1.1 Mobile Ad Hoc Networks 6 1.2 Vehicular Ad Hoc Networks 9 1.2.1 Wireless Access in Vehicular Environment (WAVE) 11 1.2.2 Communication Access for Land Mobiles (CALM) 12 1.2.3 C2C Network 13 1.3 Sensor Networks 14 1.3.1 IEEE 1451 17 1.3.2 IEEE 802.15.4 17 1.3.3 ZigBee 18 1.3.4 6LoWPAN 19 1.3.5 Bluetooth 19 1.3.6 Wireless Industrial Automation System 20 1.4 Conclusion 20 References 21 2 INTRODUCTION TO EVOLUTIONARY ALGORITHMS 27 2.1 Optimization Basics 28 2.2 Evolutionary Algorithms 29 2.3 Basic Components of Evolutionary Algorithms 32 2.3.1 Representation 32 2.3.2 Fitness Function 32 2.3.3 Selection 32 2.3.4 Crossover 33 2.3.5 Mutation 34 2.3.6 Replacement 35 2.3.7 Elitism 35 2.3.8 Stopping Criteria 35 2.4 Panmictic Evolutionary Algorithms 36 2.4.1 Generational EA 36 2.4.2 Steady-State EA 36 2.5 Evolutionary Algorithms with Structured Populations 36 2.5.1 Cellular EAs 37 2.5.2 Cooperative Coevolutionary EAs 38 2.6 Multi-Objective Evolutionary Algorithms 39 2.6.1 Basic Concepts in Multi-Objective Optimization 40 2.6.2 Hierarchical Multi-Objective Problem Optimization 42 2.6.3 Simultaneous Multi-Objective Problem Optimization 43 2.7 Conclusion 44 References 45 3 SURVEY ON OPTIMIZATION PROBLEMS FOR MOBILE AD HOC NETWORKS 49 3.1 Taxonomy of the Optimization Process 51 3.1.1 Online and Offline Techniques 51 3.1.2 Using Global or Local Knowledge 52 3.1.3 Centralized and Decentralized Systems 52 3.2 State of the Art 53 3.2.1 Topology Management 53 3.2.2 Broadcasting Algorithms 58 3.2.3 Routing Protocols 59 3.2.4 Clustering Approaches 63 3.2.5 Protocol Optimization 64 3.2.6 Modeling the Mobility of Nodes 65 3.2.7 Selfish Behaviors 66 3.2.8 Security Issues 67 3.2.9 Other Applications 67 3.3 Conclusion 68 References 69 4 MOBILE NETWORKS SIMULATION 79 4.1 Signal Propagation Modeling 80 4.1.1 Physical Phenomena 81 4.1.2 Signal Propagation Models 85 4.2 State of the Art of Network Simulators 89 4.2.1 Simulators 89 4.2.2 Analysis 92 4.3 Mobility Simulation 93 4.3.1 Mobility Models 93 4.3.2 State of the Art of Mobility Simulators 96 4.4 Conclusion 98 References 98 PART II PROBLEMS OPTIMIZATION 105 5 PROPOSED OPTIMIZATION FRAMEWORK 107 5.1 Architecture 108 5.2 Optimization Algorithms 110 5.2.1 Single-Objective Algorithms 110 5.2.2 Multi-Objective Algorithms 115 5.3 Simulators 121 5.3.1 Network Simulator: ns-3 121 5.3.2 Mobility Simulator: SUMO 123 5.3.3 Graph-Based Simulations 126 5.4 Experimental Setup 127 5.5 Conclusion 131 References 131 6 BROADCASTING PROTOCOL 135 6.1 The Problem 136 6.1.1 DFCN Protocol 136 6.1.2 Optimization Problem Definition 138 6.2 Experiments 140 6.2.1 Algorithm Configurations 140 6.2.2 Comparison of the Performance of the Algorithms 141 6.3 Analysis of Results 142 6.3.1 Building a Representative Subset of Best Solutions 143 6.3.2 Interpretation of the Results 145 6.3.3 Selected Improved DFCN Configurations 148 6.4 Conclusion 150 References 151 7 ENERGY MANAGEMENT 153 7.1 The Problem 154 7.1.1 AEDB Protocol 154 7.1.2 Optimization Problem Definition 156 7.2 Experiments 159 7.2.1 Algorithm Configurations 159 7.2.2 Comparison of the Performance of the Algorithms 160 7.3 Analysis of Results 161 7.4 Selecting Solutions from the Pareto Front 164 7.4.1 
Performance of the Selected Solutions 167 7.5 Conclusion 170 References 171 8 NETWORK TOPOLOGY 173 8.1 The Problem 175 8.1.1 Injection Networks 175 8.1.2 Optimization Problem Definition 176 8.2 Heuristics 178 8.2.1 Centralized 178 8.2.2 Distributed 179 8.3 Experiments 180 8.3.1 Algorithm Configurations 180 8.3.2 Comparison of the Performance of the Algorithms 180 8.4 Analysis of Results 183 8.4.1 Analysis of the Objective Values 183 8.4.2 Comparison with Heuristics 185 8.5 Conclusion 187 References 188 9 REALISTIC VEHICULAR MOBILITY 191 9.1 The Problem 192 9.1.1 Vehicular Mobility Model 192 9.1.2 Optimization Problem Definition 196 9.2 Experiments 199 9.2.1 Algorithms Configuration 199 9.2.2 Comparison of the Performance of the Algorithms 200 9.3 Analysis of Results 202 9.3.1 Analysis of the Decision Variables 202 9.3.2 Analysis of the Objective Values 204 9.4 Conclusion 206 References 206 10 SUMMARY AND DISCUSSION 209 10.1 A New Methodology for Optimization in Mobile Ad Hoc Networks 211 10.2 Performance of the Three Algorithmic Proposals 213 10.2.1 Broadcasting Protocol 213 10.2.2 Energy-Efficient Communications 214 10.2.3 Network Connectivity 214 10.2.4 Vehicular Mobility 215 10.3 Global Discussion on the Performance of the Algorithms 215 10.3.1 Single-Objective Case 216 10.3.2 Multi-Objective Case 217 10.4 Conclusion 218 References 218 INDEX 221
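Chapter 2's building blocks of an evolutionary algorithm (representation, fitness, selection, crossover, mutation, replacement) can be seen working together in a toy generational GA. The OneMax bit-string problem, tournament size, and rates below are invented purely for illustration and have nothing to do with the networking problems the book optimises.

import random

def genetic_algorithm(n_bits=20, pop_size=30, generations=60,
                      crossover_rate=0.9, mutation_rate=0.02, seed=42):
    """Toy generational GA maximising the number of 1-bits (OneMax)."""
    rng = random.Random(seed)
    fitness = sum                                     # fitness = count of 1s
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]

    def tournament(k=3):                              # selection
        return max(rng.sample(pop, k), key=fitness)

    for _ in range(generations):
        offspring = []
        while len(offspring) < pop_size:
            p1, p2 = tournament(), tournament()
            if rng.random() < crossover_rate:         # one-point crossover
                cut = rng.randrange(1, n_bits)
                p1, p2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
            for child in (p1, p2):
                child = [b ^ 1 if rng.random() < mutation_rate else b
                         for b in child]              # bit-flip mutation
                offspring.append(child)
        pop = offspring[:pop_size]                    # generational replacement
    best = max(pop, key=fitness)
    return best, fitness(best)

if __name__ == "__main__":
    best, score = genetic_algorithm()
    print(score, "".join(map(str, best)))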
£86.36
John Wiley & Sons Inc Meta-Algorithmics
Book SynopsisThe confluence of cloud computing, parallelism and advanced machine intelligence approaches has created a world in which the optimum knowledge system will usually be architected from the combination of two or more knowledge-generating systems. There is a need, then, to provide a reusable, broadly-applicable set of design patterns to empower the intelligent system architect to take advantage of this opportunity. This book explains how to design and build intelligent systems that are optimized for changing system requirements (adaptability), optimized for changing system input (robustness), and optimized for one or more other important system parameters (e.g., accuracy, efficiency, cost). It provides an overview of traditional parallel processing which is shown to consist primarily of task and component parallelism; before introducing meta-algorithmic parallelism which is based on combining two or more algorithms, classification engines or other systems. Key features:Table of Contents1 Introduction and Overview 1 1.1 Introduction 1 1.2 Why Is This Book Important? 2 1.3 Organization of the Book 3 1.4 Informatics 4 1.5 Ensemble Learning 6 1.6 Machine Learning/Intelligence 7 1.7 Artificial Intelligence 22 1.8 Data Mining/Knowledge Discovery 31 1.9 Classification 32 1.10 Recognition 38 1.11 System-Based Analysis 39 1.12 Summary 39 References 40 2 Parallel Forms of Parallelism 42 2.1 Introduction 42 2.2 Parallelism by Task 43 2.3 Parallelism by Component 52 2.4 Parallelism by Meta-algorithm 64 2.5 Summary 71 References 72 3 Domain Areas: Where Is This Relevant? 73 3.1 Introduction 73 3.2 Overview of the Domains 74 3.3 Primary Domains 75 3.4 Secondary Domains 86 3.5 Summary 101 References 102 4 Applications of Parallelism by Task 104 4.1 Introduction 104 4.2 Primary Domains 105 4.3 Summary 135 References 136 5 Application of Parallelism by Component 137 5.1 Introduction 137 5.2 Primary Domains 138 5.3 Summary 172 References 173 6 Introduction to Meta-algorithmics 175 6.1 Introduction 175 6.2 First-Order Meta-algorithmics 178 6.3 Second-Order Meta-algorithmics 195 6.4 Third-Order Meta-algorithmics 218 6.5 Summary 240 References 240 7 First-Order Meta-algorithmics and Their Applications 241 7.1 Introduction 241 7.2 First-Order Meta-algorithmics and the “Black Box” 241 7.3 Primary Domains 242 7.4 Secondary Domains 257 7.5 Summary 271 References 271 8 Second-Order Meta-algorithmics and Their Applications 272 8.1 Introduction 272 8.2 Second-Order Meta-algorithmics and Targeting the “Fringes” 273 8.3 Primary Domains 279 8.4 Secondary Domains 304 8.5 Summary 308 References 308 9 Third-Order Meta-algorithmics and Their Applications 310 9.1 Introduction 310 9.2 Third-Order Meta-algorithmic Patterns 311 9.3 Primary Domains 313 9.4 Secondary Domains 328 9.5 Summary 340 References 341 10 Building More Robust Systems 342 10.1 Introduction 342 10.2 Summarization 342 10.3 Cloud Systems 350 10.4 Mobile Systems 353 10.5 Scheduling 355 10.6 Classification 356 10.7 Summary 358 Reference 359 11 The Future 360 11.1 Recapitulation 360 11.2 The Pattern of all Patience 362 11.3 Beyond the Pale 365 11.4 Coming Soon 367 11.5 Summary 368 References 368 Index
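The simplest way to combine two or more classification engines is a plain majority vote over their outputs, which gives a flavour of the kind of combination the book builds its patterns around. The toy "engines" below are invented for illustration, and the sketch is not one of the book's named meta-algorithmic patterns.

from collections import Counter

def majority_vote(predictions):
    """Combine several classifiers' outputs for one sample by taking the
    most common label (ties resolved by first occurrence)."""
    return Counter(predictions).most_common(1)[0][0]

def combine(classifiers, samples):
    """Run every classifier on every sample and merge their votes."""
    return [majority_vote([clf(x) for clf in classifiers]) for x in samples]

if __name__ == "__main__":
    # Three imperfect parity "engines": each makes a different mistake,
    # so the majority vote recovers the correct answer for every input.
    def clf_a(x): return "odd" if x % 2 else "even"
    def clf_b(x): return "odd" if (x % 2 and x != 3) or x == 4 else "even"
    def clf_c(x): return "odd" if x % 2 or x == 6 else "even"
    print(combine([clf_a, clf_b, clf_c], range(8)))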
£77.36
John Wiley & Sons Inc Algorithms and Networking for Computer Games
Book SynopsisThe essential guide to solving algorithmic and networking problems in commercial computer games, revised and extended Algorithms and Networking for Computer Games, Second Editionis written from the perspective of the computer scientist. Combining algorithmic knowledge and game-related problems, it explores the most common problems encountered in game programing. The first part of the book presents practical algorithms for solving classical topics, such as random numbers, procedural generation, tournaments, group formations and game trees. The authors also focus on how to find a path in, create the terrain of, and make decisions in the game world. The second part introduces networking related problems in computer games, focusing on four key questions: how to hide the inherent communication delay, how to best exploit limited network resources, how to cope with cheating and how to measure the on-line game data. Thoroughly revised, updated, and Trade Review“More than 70 algorithms are presented, covering random numbers, noise in data (a realistic world is full of imperfections), procedural generation, tournaments, game trees, path finding, group movement, decision making, and modelling uncertainty – as well as networking problems, including dealing with cheating. The exercises at the end of each chapter range from simple thought exercises to studying Braben and Bell’s namegeneration algorithm from Elite (1984) … use of pseudocode throughout ensures the book works equally well for C, C++, Java, Python, or even C# programmers.” MagPi, Issue 64, December 2017 Table of ContentsPreface xiii 1 Introduction 1 1.1 Anatomy of Computer Games 4 1.2 Game Development 6 1.2.1 Phases of development 7 1.2.2 Documentation 8 1.2.3 Other considerations 11 1.3 Synthetic Players 12 1.3.1 Humanness 13 1.3.2 Stance 14 1.4 Multiplaying 14 1.5 Interactive Storytelling 15 1.5.1 Approaches 16 1.5.2 Storytelling in games 17 1.6 Outline of the Book 19 1.6.1 Algorithms 20 1.6.2 Networking 20 1.7 Summary 21 Exercises 21 I Algorithms 25 2 Random Numbers 26 2.1 Linear Congruential Method 27 2.1.1 Choice of parameters 30 2.1.2 Testing the randomness 32 2.1.3 Using the generators 33 2.2 Discrete Finite Distributions 36 2.3 Random Shuffling 40 2.4 Summary 44 Exercises 44 3 Noise 49 3.1 Applying Noise 50 3.2 Origin of Noise 51 3.3 Visualization 52 3.4 Interpolation 55 3.4.1 Utility routines for value conversions 56 3.4.2 Interpolation in a single parameter 58 3.4.3 Interpolation in two parameters 61 3.5 Composition of Noise 62 3.6 Periodic Noise 65 3.7 Perlin Noise 68 3.8 Worley Noise 73 3.9 Summary 83 Exercises 83 4 Procedural Generation 88 4.1 Terrain Generation 89 4.2 Maze Algorithms 96 4.2.1 Depth-first algorithm 98 4.2.2 Randomized Kruskal’s algorithm 99 4.2.3 Randomized Prim’s algorithm 101 4.3 L-Systems 101 4.3.1 Examples 103 4.3.2 City generation 105 4.4 Hierarchical Universe Generation 108 4.5 Summary 109 Exercises 111 5 Tournaments 115 5.1 Rank Adjustment Tournaments 118 5.2 Elimination Tournaments 123 5.3 Scoring Tournaments 131 5.4 Summary 135 Exercises 138 6 Game Trees 143 6.1 Minimax 144 6.1.1 Analysis 147 6.1.2 Partial minimax 148 6.2 Alpha-Beta Pruning 152 6.2.1 Analysis 156 6.2.2 Principal variation search 157 6.3 Monte Carlo Tree Search 157 6.4 Games of Chance 166 6.5 Summary 168 Exercises 170 7 Path Finding 177 7.1 Discretization of the Game World 178 7.1.1 Grid 179 7.1.2 Navigation mesh 180 7.2 Finding the Minimum Path 182 7.2.1 Evaluation function 183 7.2.2 Properties 184 7.2.3 Algorithm A* 185 
7.3 Realizing the Movement 187 7.4 Summary 189 Exercises 190 8 Group Movement 194 8.1 Flocking 195 8.2 Formations 200 8.2.1 Coordinating formations 200 8.2.2 Behaviour-based steering 204 8.2.3 Fuzzy logic control 205 8.2.4 Mass-spring systems 207 8.3 Summary 208 Exercises 208 9 Decision-Making 211 9.1 Background 211 9.1.1 Levels of decision-making 212 9.1.2 Modelled knowledge 213 9.1.3 Methods 214 9.2 Finite State Machines 218 9.2.1 Computational FSM 221 9.2.2 Mealy and Moore machines 224 9.2.3 Implementation 227 9.2.4 Discussion 228 9.3 Influence Maps 231 9.4 Automated Planning 235 9.5 Summary 237 Exercises 240 10 Modelling Uncertainty 246 10.1 Statistical Reasoning 246 10.1.1 Bayes’ theorem 246 10.1.2 Bayesian networks 248 10.1.3 Dempster–Shafer theory 249 10.2 Fuzzy Sets 252 10.2.1 Membership function 253 10.2.2 Fuzzy operations 255 10.2.3 Defuzzification 255 10.3 Fuzzy Constraint Satisfaction Problem 257 10.3.1 Modelling the criteria as fuzzy sets 259 10.3.2 Weighting the criteria importances 262 10.3.3 Aggregating the criteria 262 10.3.4 Making a decision 263 10.4 Summary 263 Exercises 265 II Networking 268 11 Communication Layers 269 11.1 Physical Platform 270 11.1.1 Resource limitations 271 11.1.2 Transmission techniques and protocols 272 11.2 Logical Platform 274 11.2.1 Communication architecture 274 11.2.2 Data and control architecture 275 11.3 Networked Application 277 11.4 Summary 278 Exercises 278 12 Compensating Resource Limitations 283 12.1 Aspects of Compensation 284 12.1.1 Consistency and responsiveness 284 12.1.2 Scalability 287 12.2 Protocol Optimization 291 12.2.1 Message compression 291 12.2.2 Message aggregation 292 12.3 Dead Reckoning 293 12.3.1 Prediction 293 12.3.2 Convergence 295 12.4 Local Perception Filters 297 12.4.1 Linear temporal contour 301 12.4.2 Adding bullet time to the delays 305 12.5 Synchronized Simulation 307 12.6 Interest Management 308 12.6.1 Aura-based interest management 310 12.6.2 Zone-based interest management 310 12.6.3 Visibility-based interest management 312 12.6.4 Class-based interest management 312 12.7 Compensation by Game Design 314 12.7.1 Short active turns 314 12.7.2 Semi-autonomous avatars 315 12.7.3 Interaction via proxies 316 12.8 Summary 317 Exercises 318 13 Cheating Prevention 321 13.1 Technical Exploitations 322 13.1.1 Packet tampering 323 13.1.2 Look-ahead cheating 324 13.1.3 Cracking and other attacks 330 13.2 Collusion 331 13.2.1 Classification 333 13.2.2 Collusion detection 335 13.3 Rule Violations 337 13.4 Summary 338 Exercises 338 14 Online Metrics 341 14.1 Players 344 14.2 Monetization 345 14.3 Acquisition 347 14.4 Game Session 347 14.5 Summary 348 Exercises 348 A Pseudocode Conventions 351 A.1 Changing the Flow of Control 355 A.1.1 Expressions 355 A.1.2 Control structures 357 A.2 Data Structures 360 A.2.1 Values and entities 360 A.2.2 Data collections 360 A.3 Format of Algorithms 365 A.4 Conversion to Existing Programming Languages 367 B Practical Vectors and Matrices 371 B.1 Points and Vectors 372 B.2 Matrices 381 B.3 Conclusion 387 Bibliography 391 Ludography 408 Index 409
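Chapter 2 opens with the linear congruential method for random numbers. A minimal generator sketch follows; the multiplier, increment, and modulus are the widely cited Numerical Recipes constants, chosen here only as an example of parameters to which the chapter's randomness tests would be applied, not as the book's recommendation.

def lcg(seed, a=1664525, c=1013904223, m=2**32):
    """Linear congruential generator: x_{k+1} = (a * x_k + c) mod m,
    yielded as floats scaled into [0, 1)."""
    x = seed
    while True:
        x = (a * x + c) % m
        yield x / m

if __name__ == "__main__":
    gen = lcg(seed=12345)
    print([round(next(gen), 4) for _ in range(5)])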
£62.06
John Wiley & Sons Inc Recent Advances in Hybrid Metaheuristics for Data Clustering
Book SynopsisAn authoritative guide to an in-depth analysis of various state-of-the-art data clustering approaches using a range of computational intelligence techniques Recent Advances in Hybrid Metaheuristics for Data Clustering offers a guide to the fundamentals of various metaheuristics and their application to data clustering. Metaheuristics are designed to tackle complex clustering problems where classical clustering algorithms have failed to be either effective or efficient. The authors?noted experts on the topic?provide a text that can aid in the design and development of hybrid metaheuristics to be applied to data clustering. The book includes performance analysis of the hybrid metaheuristics in relationship to their conventional counterparts. In addition to providing a review of data clustering, the authors include in-depth analysis of different optimization algorithms. The text offers a step-by-step guide in the build-up of hybrid metaheuristics and to enhance comprehension. In addition,Table of ContentsList of Contributors xiii Series Preface xv Preface xvii 1 Metaheuristic Algorithms in Fuzzy Clustering 1Sourav De, Sandip Dey, and Siddhartha Bhattacharyya 1.1 Introduction 1 1.2 Fuzzy Clustering 1 1.2.1 Fuzzy c-means (FCM) clustering 2 1.3 Algorithm 2 1.3.1 Selection of Cluster Centers 3 1.4 Genetic Algorithm 3 1.5 Particle Swarm Optimization 5 1.6 Ant Colony Optimization 6 1.7 Artificial Bee Colony Algorithm 7 1.8 Local Search-Based Metaheuristic Clustering Algorithms 7 1.9 Population-Based Metaheuristic Clustering Algorithms 8 1.9.1 GA-Based Fuzzy Clustering 8 1.9.2 PSO-Based Fuzzy Clustering 9 1.9.3 Ant Colony Optimization–Based Fuzzy Clustering 10 1.9.4 Artificial Bee Colony Optimization–Based Fuzzy Clustering 10 1.9.5 Differential Evolution–Based Fuzzy Clustering 11 1.9.6 Firefly Algorithm–Based Fuzzy Clustering 12 1.10 Conclusion 13 References 13 2 Hybrid Harmony Search Algorithm to Solve the Feature Selection for Data Mining Applications 19Laith Mohammad Abualigah, Mofleh Al-diabat, Mohammad Al Shinwan, Khaldoon Dhou, Bisan Alsalibi, Essam Said Hanandeh, and Mohammad Shehab 2.1 Introduction 19 2.2 Research Framework 21 2.3 Text Preprocessing 22 2.3.1 Tokenization 22 2.3.2 StopWords Removal 22 2.3.3 Stemming 23 2.3.4 Text Document Representation 23 2.3.5 TermWeight (TF-IDF) 23 2.4 Text Feature Selection 24 2.4.1 Mathematical Model of the Feature Selection Problem 24 2.4.2 Solution Representation 24 2.4.3 Fitness Function 24 2.5 Harmony Search Algorithm 25 2.5.1 Parameters Initialization 25 2.5.2 Harmony Memory Initialization 26 2.5.3 Generating a New Solution 26 2.5.4 Update Harmony Memory 27 2.5.5 Check the Stopping Criterion 27 2.6 Text Clustering 27 2.6.1 Mathematical Model of the Text Clustering 27 2.6.2 Find Clusters Centroid 27 2.6.3 Similarity Measure 28 2.7 k-means text clustering algorithm 28 2.8 Experimental Results 29 2.8.1 Evaluation Measures 29 2.8.1.1 F-measure Based on Clustering Evaluation 30 2.8.1.2 Accuracy Based on Clustering Evaluation 31 2.8.2 Results and Discussions 31 2.9 Conclusion 34 References 34 3 Adaptive Position–Based Crossover in the Genetic Algorithm for Data Clustering 39Arnab Gain and Prasenjit Dey 3.1 Introduction 39 3.2 Preliminaries 40 3.2.1 Clustering 40 3.2.1.1 k-means Clustering 40 3.2.2 Genetic Algorithm 41 3.3 RelatedWorks 42 3.3.1 GA-Based Data Clustering by Binary Encoding 42 3.3.2 GA-Based Data Clustering by Real Encoding 43 3.3.3 GA-Based Data Clustering for Imbalanced Datasets 44 3.4 Proposed Model 44 3.5 Experimentation 46 
3.5.1 Experimental Settings 46 3.5.2 DB Index 47 3.5.3 Experimental Results 49 3.6 Conclusion 51 References 57 4 Application of Machine Learning in the Social Network 61Belfin R. V., E. Grace Mary Kanaga, and Suman Kundu 4.1 Introduction 61 4.1.1 Social Media 61 4.1.2 Big Data 62 4.1.3 Machine Learning 62 4.1.4 Natural Language Processing (NLP) 63 4.1.5 Social Network Analysis 64 4.2 Application of Classification Models in Social Networks 64 4.2.1 Spam Content Detection 65 4.2.2 Topic Modeling and Labeling 65 4.2.3 Human Behavior Analysis 67 4.2.4 Sentiment Analysis 68 4.3 Application of Clustering Models in Social Networks 68 4.3.1 Recommender Systems 69 4.3.2 Sentiment Analysis 70 4.3.3 Information Spreading or Promotion 70 4.3.4 Geolocation-Specific Applications 70 4.4 Application of Regression Models in Social Networks 71 4.4.1 Social Network and Human Behavior 71 4.4.2 Emotion Contagion through Social Networks 73 4.4.3 Recommender Systems in Social Networks 74 4.5 Application of Evolutionary Computing and Deep Learning in Social Networks 74 4.5.1 Evolutionary Computing and Social Network 75 4.5.2 Deep Learning and Social Networks 75 4.6 Summary 76 Acknowledgments 77 References 78 5 Predicting Students’ Grades Using CART, ID3, and Multiclass SVM Optimized by the Genetic Algorithm (GA): A Case Study 85Debanjan Konar, Ruchita Pradhan, Tania Dey, Tejaswini Sapkota, and Prativa Rai 5.1 Introduction 85 5.2 Literature Review 87 5.3 Decision Tree Algorithms: ID3 and CART 88 5.4 Multiclass Support Vector Machines (SVMs) Optimized by the Genetic Algorithm (GA) 90 5.4.1 Genetic Algorithms for SVM Model Selection 92 5.5 Preparation of Datasets 93 5.6 Experimental Results and Discussions 95 5.7 Conclusion 96 References 96 6 Cluster Analysis of Health Care Data Using Hybrid Nature-Inspired Algorithms 101Kauser Ahmed P, Rishabh Agrawal 6.1 Introduction 101 6.2 RelatedWork 102 6.2.1 Firefly Algorithm 102 6.2.2 k-means Algorithm 103 6.3 Proposed Methodology 104 6.4 Results and Discussion 106 6.5 Conclusion 110 References 111 7 Performance Analysis Through a Metaheuristic Knowledge Engine 113Indu Chhabra and Gunmala Suri 7.1 Introduction 113 7.2 Data Mining and Metaheuristics 114 7.3 Problem Description 115 7.4 Association Rule Learning 116 7.4.1 Association Mining Issues 116 7.4.2 Research Initiatives and Projects 116 7.5 Literature Review 117 7.6 Methodology 119 7.6.1 Phase 1: Pattern Search 120 7.6.2 Phase 2: Rule Mining 120 7.6.3 Phase 3: Knowledge Derivation 121 7.7 Implementation 121 7.7.1 Test Issues 121 7.7.2 System Evaluation 121 7.7.2.1 Indicator Matrix Formulation 122 7.7.2.2 Phase 1: Frequent Pattern Derivation 123 7.7.2.3 Phase 2: Association Rule Framing 123 7.7.2.4 Phase 3: Knowledge Discovery Through Metaheuristic Implementation 123 7.8 Performance Analysis 124 7.9 Research Contributions and Future Work 125 7.10 Conclusion 126 References 126 8 Magnetic Resonance Image Segmentation Using a Quantum-Inspired Modified Genetic Algorithm (QIANA) Based on FRCM 129Sunanda Das, Sourav De, Sandip Dey, and Siddhartha Bhattacharyya 8.1 Introduction 129 8.2 Literature Survey 131 8.3 Quantum Computing 133 8.3.1 Quoit-Quantum Bit 133 8.3.2 Entanglement 133 8.3.3 Measurement 133 8.3.4 Quantum Gate 134 8.4 Some Quality Evaluation Indices for Image Segmentation 134 8.4.1 F(I) 134 8.4.2 F’(I) 135 8.4.3 Q(I) 135 8.5 Quantum-Inspired Modified Genetic Algorithm (QIANA)–Based FRCM 135 8.5.1 Quantum-Inspired MEGA (QIANA)–Based FRCM 136 8.6 Experimental Results and Discussion 139 8.7 Conclusion 147 References 
147 9 A Hybrid Approach Using the k-means and Genetic Algorithms for Image Color Quantization 151Marcos Roberto e Souza, Anderson Carlos Sousa e Santos, and Helio Pedrini 9.1 Introduction 151 9.2 Background 152 9.3 Color Quantization Methodology 154 9.3.1 Crossover Operators 157 9.3.2 Mutation Operators 158 9.3.3 Fitness Function 158 9.4 Results and Discussions 159 9.5 Conclusions and Future Work 168 Acknowledgments 168 References 168 Index 173
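Chapter 1 builds its metaheuristic hybrids on top of fuzzy c-means (FCM). A compact NumPy sketch of the two alternating FCM updates (centres as membership-weighted means, then memberships from inverse distances) is shown below; the random initialisation, fuzzifier m = 2, and toy data are illustrative defaults, not taken from the book.

import numpy as np

def fuzzy_c_means(X, c, m=2.0, n_iter=100, tol=1e-5, seed=0):
    """Alternate the two FCM updates until memberships stop changing."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)            # each row of memberships sums to 1
    for _ in range(n_iter):
        W = U ** m                               # fuzzified memberships
        centres = (W.T @ X) / W.sum(axis=0)[:, None]
        dist = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2) + 1e-12
        inv = dist ** (-2.0 / (m - 1.0))
        U_new = inv / inv.sum(axis=1, keepdims=True)
        if np.abs(U_new - U).max() < tol:
            U = U_new
            break
        U = U_new
    return centres, U

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(-2, 0.4, (40, 2)), rng.normal(2, 0.4, (40, 2))])
    centres, U = fuzzy_c_means(X, c=2)
    print("centres:\n", centres.round(2))
    print("memberships of first point:", U[0].round(3))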
£99.70
John Wiley & Sons Inc Essential Algorithms
Book SynopsisA friendly introduction to the most usefulalgorithms written in simple, intuitive English The revised and updated second edition of Essential Algorithms, offers an accessible introduction to computer algorithms. The book contains a description of important classical algorithms and explains when each is appropriate. The author shows how to analyze algorithms in order to understand their behavior and teaches techniques that the can be used to create new algorithms to meet future needs. The text includes useful algorithms such as: methods for manipulating common data structures, advanced data structures, network algorithms, and numerical algorithms. It also offers a variety of general problem-solving techniques. In addition to describing algorithms and approaches, the author offers details on how to analyze the performance of algorithms. The book is filled with exercises that can be used to explore ways to modify the algorithms in order to apply them to new Table of ContentsIntroduction xxix Chapter 1 Algorithm Basics 1 Approach 2 Algorithms and Data Structures 2 Pseudocode 3 Algorithm Features 6 Big O Notation 7 Rule 1 8 Rule 2 8 Rule 3 9 Rule 4 9 Rule 5 10 Common Run Time Functions 11 1 11 Log N 11 Sqrt N 14 N 14 N log N 15 N2 15 2N 15 N! 16 Visualizing Functions 16 Practical Considerations 18 Summary 19 Exercises 20 Chapter 2 Numerical Algorithms 23 Randomizing Data 23 Generating Random Values 23 Generating Values 24 Ensuring Fairness 26 Getting Fairness from Biased Sources 28 Randomizing Arrays 29 Generating Nonuniform Distributions 30 Making Random Walks 31 Making Self-Avoiding Walks 33 Making Complete Self-Avoiding Walks 34 Finding Greatest Common Divisors 36 Calculating Greatest Common Divisors 36 Extending Greatest Common Divisors 38 Performing Exponentiation 40 Working with Prime Numbers 42 Finding Prime Factors 42 Finding Primes 44 Testing for Primality 45 Performing Numerical Integration 47 The Rectangle Rule 48 The Trapezoid Rule 49 Adaptive Quadrature 50 Monte Carlo Integration 54 Finding Zeros 55 Gaussian Elimination 57 Forward Elimination 58 Back Substitution 60 The Algorithm 61 Least Squares Fits 62 Linear Least Squares 62 Polynomial Least Squares 64 Summary 67 Exercises 68 Chapter 3 Linked Lists 71 Basic Concepts 71 Singly Linked Lists 72 Iterating Over the List 73 Finding Cells 73 Using Sentinels 74 Adding Cells at the Beginning 75 Adding Cells at the End 76 Inserting Cells After Other Cells 77 Deleting Cells 78 Doubly Linked Lists 79 Sorted Linked Lists 81 Self-Organizing Linked Lists 82 Move to Front (MTF) 83 Swap 83 Count 84 Hybrid Methods 84 Pseudocode 85 Linked-List Algorithms 86 Copying Lists 86 Sorting with Insertionsort 87 Sorting with Selectionsort 88 Multithreaded Linked Lists 90 Linked Lists with Loops 91 Marking Cells 92 Using Hash Tables 93 List Retracing 94 List Reversal 95 Tortoise and Hare 98 Loops in Doubly Linked Lists 100 Summary 100 Exercises 101 Chapter 4 Arrays 103 Basic Concepts 103 One-Dimensional Arrays 106 Finding Items 106 Finding Minimum, Maximum, and Average 107 Finding Median 108 Finding Mode 109 Inserting Items 112 Removing Items 113 Nonzero Lower Bounds 114 Two Dimensions 114 Higher Dimensions 115 Triangular Arrays 118 Sparse Arrays 121 Find a Row or Column 123 Get a Value 124 Set a Value 125 Delete a Value 127 Matrices 129 Summary 131 Exercises 132 Chapter 5 Stacks and Queues 135 Stacks 135 Linked-List Stacks 136 Array Stacks 138 Double Stacks 139 Stack Algorithms 141 Reversing an Array 141 Train Sorting 142 Tower of Hanoi 143 
Stack Insertionsort 145 Stack Selectionsort 146 Queues 147 Linked-List Queues 148 Array Queues 148 Specialized Queues 151 Priority Queues 151 Deques 152 Binomial Heaps 152 Binomial Trees 152 Binomial Heaps 154 Merging Trees 155 Merging Heaps 156 Merging Tree Lists 156 Merging Trees 158 Enqueue 161 Dequeue 162 Runtime 163 Summary 163 Exercises 164 Chapter 6 Sorting 167 O(N2 ) Algorithms 168 Insertionsort in Arrays 168 Selectionsort in Arrays 170 Bubblesort 171 O(NlogN) Algorithms 174 Heapsort 175 Storing Complete Binary Trees 175 Defining Heaps 176 Implementing Heapsort 180 Quicksort 181 Analyzing Quicksort’s Run Time 182 Picking a Dividing Item 184 Implementing Quicksort with Stacks 185 Implementing Quicksort in Place 185 Using Quicksort 188 Mergesort 189 Sub O(NlogN) Algorithms 192 Countingsort 192 Pigeonhole Sort 193 Bucketsort 195 Summary 197 Exercises 198 Chapter 7 Searching 201 Linear Search 202 Binary Search 203 Interpolation Search 204 Majority Voting 205 Summary 207 Exercises 208 Chapter 8 Hash Tables 209 Hash Table Fundamentals 210 Chaining 211 Open Addressing 213 Removing Items 214 Linear Probing 215 Quadratic Probing 217 Pseudorandom Probing 219 Double Hashing 219 Ordered Hashing 219 Summary 222 Exercises 222 Chapter 9 Recursion 227 Basic Algorithms 228 Factorial 228 Fibonacci Numbers 230 Rod-Cutting 232 Brute Force 233 Recursion 233 Tower of Hanoi 235 Graphical Algorithms 238 Koch Curves 239 Hilbert Curve 241 Sierpiński Curve 243 Gaskets 246 The Skyline Problem 247 Lists 248 Divide and Conquer 249 Backtracking Algorithms 252 Eight Queens Problem 254 Knight’s Tour 257 Selections and Permutations 260 Selections with Loops 261 Selections with Duplicates 262 Selections without Duplicates 264 Permutations with Duplicates 265 Permutations without Duplicates 266 Round-Robin Scheduling 267 Odd Number of Teams 268 Even Number of Teams 270 Implementation 271 Recursion Removal 273 Tail Recursion Removal 274 Dynamic Programming 275 Bottom-Up Programming 277 General Recursion Removal 277 Summary 280 Exercises 281 Chapter 10 Trees 285 Tree Terminology 285 Binary Tree Properties 289 Tree Representations 292 Building Trees in General 292 Building Complete Trees 295 Tree Traversal 296 Preorder Traversal 297 Inorder Traversal 299 Postorder Traversal 300 Breadth-First Traversal 301 Traversal Uses 302 Traversal Run Times 303 Sorted Trees 303 Adding Nodes 303 Finding Nodes 306 Deleting Nodes 306 Lowest Common Ancestors 309 Sorted Trees 309 Parent Pointers 310 Parents and Depths 311 General Trees 312 Euler Tours 314 All Pairs 316 Threaded Trees 317 Building Threaded Trees 318 Using Threaded Trees 320 Specialized Tree Algorithms 322 The Animal Game 322 Expression Evaluation 324 Interval Trees 326 Building the Tree 328 Intersecting with Points 329 Intersecting with Intervals 330 Quadtrees 332 Adding Items 335 Finding Items 336 Tries 337 Adding Items 339 Finding Items 341 Summary 342 Exercises 342 Chapter 11 Balanced Trees 349 AVL Trees 350 Adding Values 350 Deleting Values 353 2-3 Trees 354 Adding Values 355 Deleting Values 356 B-Trees 359 Adding Values 360 Deleting Values 361 Balanced Tree Variations 362 Top-down B-trees 363 B+trees 363 Summary 365 Exercises 365 Chapter 12 Decision Trees 367 Searching Game Trees 368 Minimax 369 Initial Moves and Responses 373 Game Tree Heuristics 374 Searching General Decision Trees 375 Optimization Problems 376 Exhaustive Search 377 Branch and Bound 379 Decision Tree Heuristics 381 Random Search 381 Improving Paths 382 Simulated Annealing 384 Hill Climbing 385 
Sorted Hill Climbing 386 Other Decision Tree Problems 387 Generalized Partition Problem 387 Subset Sum 388 Bin Packing 388 Cutting Stock 389 Knapsack 390 Traveling Salesman Problem 391 Satisfiability 391 Swarm Intelligence 392 Ant Colony Optimization 393 General Optimization 393 Traveling Salesman 393 Bees Algorithm 394 Swarm Simulation 394 Boids 395 Pseudoclassical Mechanics 396 Goals and Obstacles 397 Summary 397 Exercises 398 Chapter 13 Basic Network Algorithms 403 Network Terminology 403 Network Representations 407 Traversals 409 Depth-First Traversal 410 Breadth-First Traversal 412 Connectivity Testing 413 Spanning Trees 416 Minimal Spanning Trees 417 Euclidean Minimum Spanning Trees 418 Building Mazes 419 Strongly Connected Components 420 Kosaraju’s Algorithm 421 Algorithm Discussion 422 Finding Paths 425 Finding Any Path 425 Label-Setting Shortest Paths 426 Label-Correcting Shortest Paths 430 All-Pairs Shortest Paths 431 Transitivity 436 Transitive Closure 437 Transitive Reduction 438 Acyclic Networks 439 General Networks 440 Shortest Path Modifications 441 Shape Points 441 Early Stopping 442 Bidirectional Search 442 Best-First Search 442 Turn Penalties and Prohibitions 443 Geometric Calculations 443 Expanded Node Networks 444 Interchange Networks 445 Summary 447 Exercises 447 Chapter 14 More Network Algorithms 451 Topological Sorting 451 Cycle Detection 455 Map Coloring 456 Two-Coloring 456 Three-Coloring 458 Four-Coloring 459 Five-Coloring 459 Other Map-Coloring Algorithms 462 Maximal Flow 464 Work Assignment 467 Minimal Flow Cut 468 Network Cloning 470 Dictionaries 471 Clone References 472 Cliques 473 Brute Force 474 Bron–Kerbosch 475 Sets R, P, and X 475 Recursive Calls 476 Pseudocode 476 Example 477 Variations 480 Finding Triangles 480 Brute Force 481 Checking Local Links 481 Chiba and Nishizeki 482 Community Detection 483 Maximal Cliques 483 Girvan–Newman 483 Clique Percolation 485 Eulerian Paths and Cycles 485 Brute Force 486 Fleury’s Algorithm 486 Hierholzer’s Algorithm 487 Summary 488 Exercises 489 Chapter 15 String Algorithms 493 Matching Parentheses 494 Evaluating Arithmetic Expressions 495 Building Parse Trees 496 Pattern Matching 497 DFAs 497 Building DFAs for Regular Expressions 500 NFAs 502 String Searching 504 Calculating Edit Distance 508 Phonetic Algorithms 511 Soundex 511 Metaphone 513 Summary 514 Exercises 515 Chapter 16 Cryptography 519 Terminology 520 Transposition Ciphers 521 Row/Column Transposition 521 Column Transposition 523 Route Ciphers 525 Substitution Ciphers 526 Caesar Substitution 526 Vigenere Cipher 527 Simple Substitution 529 One-Time Pads 530 Block Ciphers 531 Substitution-Permutation Networks 531 Feistel Ciphers 533 Public-Key Encryption and RSA 534 Euler’s Totient Function 535 Multiplicative Inverses 536 An RSA Example 536 Practical Considerations 537 Other Uses for Cryptography 538 Summary 539 Exercises 540 Chapter 17 Complexity Theory 543 Notation 544 Complexity Classes 545 Reductions 548 3SAT 549 Bipartite Matching 550 NP-Hardness 550 Detection, Reporting, and Optimization Problems 551 Detection ≤p Reporting 552 Reporting ≤p Optimization 552 Reporting ≤p Detection 552 Optimization ≤p Reporting 553 Approximate Optimization 553 NP-Complete Problems 554 Summary 557 Exercises 558 Chapter 18 Distributed Algorithms 561 Types of Parallelism 562 Systolic Arrays 562 Distributed Computing 565 Multi-CPU Processing 567 Race Conditions 567 Deadlock 571 Quantum Computing 572 Distributed Algorithms 573 Debugging Distributed Algorithms 573 Embarrassingly 
Parallel Algorithms 574 Mergesort 576 Dining Philosophers 577 Randomization 578 Resource Hierarchy 578 Waiter 579 Chandy/Misra 579 The Two Generals Problem 580 Byzantine Generals 581 Consensus 584 Leader Election 587 Snapshot 588 Clock Synchronization 589 Summary 591 Exercises 591 Chapter 19 Interview Puzzles 595 Asking Interview Puzzle Questions 597 Answering Interview Puzzle Questions 598 Summary 602 Exercises 604 Appendix A Summary of Algorithmic Concepts 607 Chapter 1: Algorithm Basics 607 Chapter 2: Numeric Algorithms 608 Chapter 3: Linked Lists 609 Chapter 4: Arrays 610 Chapter 5: Stacks and Queues 610 Chapter 6: Sorting 610 Chapter 7: Searching 611 Chapter 8: Hash Tables 612 Chapter 9: Recursion 612 Chapter 10: Trees 614 Chapter 11: Balanced Trees 615 Chapter 12: Decision Trees 615 Chapter 13: Basic Network Algorithms 616 Chapter 14: More Network Algorithms 617 Chapter 15: String Algorithms 618 Chapter 16: Cryptography 618 Chapter 17: Complexity Theory 619 Chapter 18: Distributed Algorithms 620 Chapter 19: Interview Puzzles 621 Appendix B Solutions to Exercises 623 Chapter 1: Algorithm Basics 623 Chapter 2: Numerical Algorithms 626 Chapter 3: Linked Lists 633 Chapter 4: Arrays 638 Chapter 5: Stacks and Queues 648 Chapter 6: Sorting 650 Chapter 7: Searching 653 Chapter 8: Hash Tables 655 Chapter 9: Recursion 658 Chapter 10: Trees 663 Chapter 11: Balanced Trees 670 Chapter 12: Decision Trees 675 Chapter 13: Basic Network Algorithms 678 Chapter 14: More Network Algorithms 681 Chapter 15: String Algorithms 686 Chapter 16: Encryption 689 Chapter 17: Complexity Theory 692 Chapter 18: Distributed Algorithms 697 Chapter 19: Interview Puzzles 701 Glossary 711 Index 739
£37.50
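As a taste of the material behind the Chapter 2 headings above (Finding Greatest Common Divisors and Extending Greatest Common Divisors), here is a minimal Python sketch of the classic Euclidean algorithm and its extended form. It is an illustrative example only, not code taken from the book.

def gcd(a, b):
    """Euclid's algorithm: repeatedly replace (a, b) with (b, a % b)."""
    while b:
        a, b = b, a % b
    return a

def extended_gcd(a, b):
    """Return (g, x, y) with a*x + b*y == g == gcd(a, b); nonnegative inputs assumed."""
    if b == 0:
        return a, 1, 0
    g, x, y = extended_gcd(b, a % b)
    # gcd(a, b) == gcd(b, a % b); rewrite the coefficients for (a, b)
    return g, y, x - (a // b) * y

print(gcd(4851, 3003))           # 231
print(extended_gcd(4851, 3003))  # (231, 5, -8), since 5*4851 - 8*3003 == 231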
John Wiley & Sons Inc Role of Edge Analytics in Sustainable Smart City
Book SynopsisEfficient Single Board Computers (SBCs) and advanced VLSI systems have resulted in edge analytics and faster decision making. QoS parameters such as energy, delay, reliability, security, and throughput must be improved to realize better intelligent expert systems. The resource constraints of edge devices challenge researchers to meet the required QoS. Since these devices and components work in remote, unattended environments, an optimum methodology to improve their lifetime has become mandatory. Continuous monitoring of events is needed to avoid tragic situations; it can only be enabled by providing high QoS. The applications of IoT in digital twin development, health care, traffic analysis, home surveillance, intelligent agriculture monitoring, defense, and common day-to-day activities have resulted in pioneering embedded devices, which can offer high computational capability without much latency or delay. The book addresses industrial problems in designing expert systemTable of ContentsPreface xv 1 Smart Health Care Development: Challenges and Solutions 1R. Sujatha, E.P. Ephzibah and S. Sree Dharinya 1.1 Introduction 2 1.2 ICT Explosion 3 1.2.1 RFID 4 1.2.2 IoT and Big Data 5 1.2.3 Wearable Sensors—Head to Toe 7 1.2.4 Cloud Computing 8 1.3 Intelligent Healthcare 10 1.4 Home Healthcare 11 1.5 Data Analytics 11 1.6 Technologies—Data Cognitive 13 1.6.1 Machine Learning 13 1.6.2 Image Processing 14 1.6.3 Deep Learning 14 1.7 Adoption Technologies 15 1.8 Conclusion 15 References 15 2 Working of Mobile Intelligent Agents on the Web—A Survey 21P.R. Joe Dhanith and B. Surendiran 2.1 Introduction 21 2.2 Mobile Crawler 23 2.3 Comparative Study of the Mobile Crawlers 47 2.4 Conclusion 47 References 47 3 Power Management Scheme for Photovoltaic/Battery Hybrid System in Smart Grid 49T. Bharani Prakash and S. Nagakumararaj 3.1 Power Management Scheme 50 3.2 Internal Power Flow Management 50 3.2.1 PI Controller 51 3.2.2 State of Charge 53 3.3 Voltage Source Control 54 3.3.1 Phase-Locked Loop 55 3.3.2 Space Vector Pulse Width Modulation 56 3.3.3 Park Transformation (abc to dq0) 57 3.4 Simulation Diagram and Results 58 3.4.1 Simulation Diagram 58 3.4.2 Simulation Results 63 Conclusion 65 4 Analysis: A Neural Network Equalizer for Channel Equalization by Particle Swarm Optimization for Various Channel Models 67M. Muthumari, D.C. Diana and C. Ambika Bhuvaneswari 4.1 Introduction 68 4.2 Channel Equalization 72 4.2.1 Channel Models 73 4.2.1.1 Tapped Delay Line Model 74 4.2.1.2 Stanford University Interim (SUI) Channel Models 75 4.2.2 Artificial Neural Network 75 4.3 Functional Link Artificial Neural Network 76 4.4 Particle Swarm Optimization 76 4.5 Result and Discussion 77 4.5.1 Convergence Analysis 77 4.5.2 Comparison Between Different Parameters 79 4.5.3 Comparison Between Different Channel Models 80 4.6 Conclusion 81 References 82 5 Implementing Hadoop Container Migrations in OpenNebula Private Cloud Environment 85P. Kalyanaraman, K.R. Jothi, P. Balakrishnan, R.G. Navya, A. Shah and V. Pandey 5.1 Introduction 86 5.1.1 Hadoop Architecture 86 5.1.2 Hadoop and Big Data 88 5.1.3 Hadoop and Virtualization 88 5.1.4 What is OpenNebula?
89 5.2 Literature Survey 90 5.2.1 Performance Analysis of Hadoop 90 5.2.2 Evaluating Map Reduce on Virtual Machines 91 5.2.3 Virtualizing Hadoop Containers 94 5.2.4 Optimization of Hadoop Cluster Using Cloud Platform 95 5.2.5 Heterogeneous Clusters in Cloud Computing 96 5.2.6 Performance Analysis and Optimization in Hadoop 97 5.2.7 Virtual Technologies 97 5.2.8 Scheduling 98 5.2.9 Scheduling of Hadoop VMs 98 5.3 Discussion 99 5.4 Conclusion 100 References 101 6 Transmission Line Inspection Using Unmanned Aerial Vehicle 105A. Mahaboob Subahani, M. Kathiresh and S. Sanjeev 6.1 Introduction 106 6.1.1 Unmanned Aerial Vehicle 106 6.1.2 Quadcopter 106 6.2 Literature Survey 107 6.3 System Architecture 108 6.4 ArduPilot 109 6.5 Arduino Mega 111 6.6 Brushless DC Motor 111 6.7 Battery 112 6.8 CMOS Camera 113 6.9 Electronic Speed Control 113 6.10 Power Module 115 6.11 Display Shield 116 6.12 Navigational LEDs 116 6.13 Role of Sensors in the Proposed System 118 6.13.1 Accelerometer and Gyroscope 118 6.13.2 Magnetometer 118 6.13.3 Barometric Pressure Sensor 119 6.13.4 Global Positioning System 119 6.14 Wireless Communication 120 6.15 Radio Controller 120 6.16 Telemetry Radio 121 6.17 Camera Transmitter 121 6.18 Results and Discussion 121 6.19 Conclusion 124 References 125 7 Smart City Infrastructure Management System Using IoT 127S. Ramamoorthy, M. Kowsigan, P. Balasubramanie and P. John Paul 7.1 Introduction 128 7.2 Major Challenges in IoT-Based Technology 129 7.2.1 Peer to Peer Communication Security 129 7.2.2 Objective of Smart Infrastructure 130 7.3 Internet of Things (IoT) 131 7.3.1 Key Components of IoT 131 7.3.1.1 Network Gateway 132 7.3.1.2 HTTP (HyperText Transfer Protocol) 132 7.3.1.3 LoRaWan (Long Range Wide Area Network) 133 7.3.1.4 Bluetooth 133 7.3.1.5 ZigBee 133 7.3.2 IoT Data Protocols 133 7.3.2.1 Message Queue Telemetry Transport (MQTT) 133 7.3.2.2 Constrained Application Protocol (CoAP) 134 7.3.2.3 Advanced Message Queuing Protocol (AMQP) 134 7.3.2.4 Data Analytics 134 7.4 Machine Learning-Based Smart Decision-Making Process 135 7.5 Cloud Computing 136 References 138 8 Lightweight Cryptography Algorithms for IoT Resource-Starving Devices 139S. Aruna, G. Usha, P. Madhavan and M.V. Ranjith Kumar 8.1 Introduction 139 8.1.1 Need of the Cryptography 140 8.2 Challenges on Lightweight Cryptography 141 8.3 Hashing Techniques on Lightweight Cryptography 142 8.4 Applications on Lightweight Cryptography 152 8.5 Conclusion 167 References 168 9 Pre-Learning-Based Semantic Segmentation for LiDAR Point Cloud Data Using Self-Organized Map 171K. Rajathi and P. Sarasu 9.1 Introduction 172 9.2 Related Work 173 9.2.1 Semantic Segmentation for Images 173 9.3 Semantic Segmentation for LiDAR Point Cloud 173 9.4 Proposed Work 175 9.4.1 Data Acquisition 175 9.4.2 Our Approach 175 9.4.3 Pre-Learning Processing 179 9.5 Region of Interest (RoI) 180 9.6 Registration of Point Cloud 181 9.7 Semantic Segmentation 181 9.8 Self-Organized Map (SOM) 182 9.9 Experimental Result 183 9.10 Conclusion 186 References 187 10 Smart Load Balancing Algorithms in Cloud Computing—A Review 189K.R. Jothi, S. Anto, M. Kohar, M. Chadha and P.
Madhavan 10.1 Introduction 189 10.2 Research Challenges 192 10.2.1 Security & Routing 192 10.2.2 Storage/Replication 192 10.2.3 Spatial Spread of the Cloud Nodes 192 10.2.4 Fault Tolerance 193 10.2.5 Algorithm Complexity 193 10.3 Literature Survey 193 10.4 Survey Table 201 10.5 Discussion & Comparison 202 10.6 Conclusion 202 References 216 11 A Low-Cost Wearable Remote Healthcare Monitoring System 219Konguvel Elango and Kannan Muniandi 11.1 Introduction 219 11.1.1 Problem Statement 220 11.1.2 Objective of the Study 221 11.2 Related Works 222 11.2.1 Remote Healthcare Monitoring Systems 222 11.2.2 Pulse Rate Detection 224 11.2.3 Temperature Measurement 225 11.2.4 Fall Detection 225 11.3 Methodology 226 11.3.1 NodeMCU 226 11.3.2 Pulse Rate Detection System 227 11.3.3 Fall Detection System 230 11.3.4 Temperature Detection System 231 11.3.5 LCD Specification 234 11.3.6 ADC Specification 234 11.4 Results and Discussions 236 11.4.1 System Implementation 236 11.4.2 Fall Detection Results 236 11.4.3 ThingSpeak 236 11.5 Conclusion 239 11.6 Future Scope 240 References 241 12 IoT-Based Secure Smart Infrastructure Data Management 243R. Poorvadevi, M. Kowsigan, P. Balasubramanie and J. Rajeshkumar 12.1 Introduction 244 12.1.1 List of Security Threats Related to the Smart IoT Network 244 12.1.2 Major Application Areas of IoT 244 12.1.3 IoT Threats and Security Issues 245 12.1.4 Unpatched Vulnerabilities 245 12.1.5 Weak Authentication 245 12.1.6 Vulnerable APIs 245 12.2 Types of Threats to Users 245 12.3 Internet of Things Security Management 246 12.3.1 Managing IoT Devices 246 12.3.2 Role of External Devices in IoT Platform 247 12.3.3 Threats to Other Computer Networks 248 12.4 Significance of IoT Security 249 12.4.1 Aspects of Workplace Security 249 12.4.2 Important IoT Security Breaches and IoT Attacks 250 12.5 IoT Security Tools and Legislation 250 12.6 Protection of IoT Systems and Devices 251 12.6.1 IoT Issues and Security Challenges 251 12.6.2 Providing Secured Connections 252 12.7 Five Ways to Secure IoT Devices 253 12.8 Conclusion 255 References 255 13 A Study of Addiction Behavior for Smart Psychological Health Care System 257V. Sabapathi and K.P. Vijayakumar 13.1 Introduction 258 13.2 Basic Criteria of Addiction 258 13.3 Influencing Factors of Addiction Behavior 259 13.3.1 Peers Influence 259 13.3.2 Environment Influence 260 13.3.3 Media Influence 262 13.3.4 Family Group and Society 262 13.4 Types of Addiction and Their Effects 262 13.4.1 Gaming Addiction 263 13.4.2 Pornography Addiction 264 13.4.3 Smart Phone Addiction 265 13.4.4 Gambling Addiction 267 13.4.5 Food Addiction 267 13.4.6 Sexual Addiction 268 13.4.7 Cigarette and Alcohol Addiction 268 13.4.8 Status Expressive Addiction 269 13.4.9 Workaholic Addiction 269 13.5 Conclusion 269 References 270 14 A Custom Cluster Design With Raspberry Pi for Parallel Programming and Deployment of Private Cloud 273Sukesh, B., Venkatesh, K. and Srinivas, L.N.B.
14.1 Introduction 274 14.2 Cluster Design with Raspberry Pi 276 14.2.1 Assembling Materials for Implementing Cluster 276 14.2.1.1 Raspberry Pi4 277 14.2.1.2 RPi 4 Model B Specifications 277 14.2.2 Setting Up Cluster 278 14.2.2.1 Installing Raspbian and Configuring Master Node 279 14.2.2.2 Installing MPICH and MPI4PY 279 14.2.2.3 Cloning the Slave Nodes 279 14.3 Parallel Computing and MPI on Raspberry Pi Cluster 279 14.4 Deployment of Private Cloud on Raspberry Pi Cluster 281 14.4.1 NextCloud Software 281 14.5 Implementation 281 14.5.1 NextCloud on RPi Cluster 281 14.5.2 Parallel Computing on RPi Cluster 282 14.6 Results and Discussions 286 14.7 Conclusion 287 References 287 15 Energy Efficient Load Balancing Technique for Distributed Data Transmission Using Edge Computing 289Karthikeyan, K. and Madhavan, P. 15.1 Introduction 290 15.2 Energy Efficiency Offloading Data Transmission 290 15.2.1 Web-Based Offloading 291 15.3 Energy Harvesting 291 15.3.1 LODCO Algorithm 292 15.4 User-Level Online Offloading Framework (ULOOF) 293 15.5 Frequency Scaling 294 15.6 Computation Offloading and Resource Allocation 295 15.7 Communication Technology 296 15.8 Ultra-Dense Network 297 15.9 Conclusion 299 References 299 16 Blockchain-Based SDR Signature Scheme With Time-Stamp 303Swathi Singh, Divya Satish and Sree Rathna Lakshmi 16.1 Introduction 303 16.2 Literature Study 304 16.2.1 Signatures With Hashes 304 16.2.2 Signature Scheme With Server Support 305 16.2.3 Signatures Scheme Based on Interaction 305 16.3 Methodology 306 16.3.1 Preliminaries 306 16.3.1.1 Hash Trees 306 16.3.1.2 Chains of Hashes 306 16.3.2 Interactive Hash-Based Signature Scheme 307 16.3.3 Significant Properties of Hash-Based Signature Scheme 309 16.3.4 Proposed SDR Scheme Structure 310 16.3.4.1 One-Time Keys 310 16.3.4.2 Server Behavior Authentication 310 16.3.4.3 Pre-Authentication by Repository 311 16.4 SDR Signature Scheme 311 16.4.1 Pre-Requisites 311 16.4.2 Key Generation Algorithm 312 16.4.2.1 Server 313 16.4.3 Sign Algorithm 313 16.4.3.1 Signer 313 16.4.3.2 Server 313 16.4.3.3 Repository 314 16.4.4 Verification Algorithm 314 16.5 Supportive Theory 315 16.5.1 Signing Algorithm Supported by Server 315 16.5.2 Repository Deployment 316 16.5.3 SDR Signature Scheme Setup 316 16.5.4 Results and Observation 316 16.6 Conclusion 317 References 317 Index 321
£164.66
John Wiley & Sons Inc Algorithms in Bioinformatics
Book SynopsisALGORITHMS IN BIOINFORMATICS Explore a comprehensive and insightful treatment of the practical application of bioinformatic algorithms in a variety of fields Algorithms in Bioinformatics: Theory and Implementation delivers a fulsome treatment of some of the main algorithms used to explain biological functions and relationships. It introduces readers to the art of algorithms in a practical manner which is linked with biological theory and interpretation. The book covers many key areas of bioinformatics, including global and local sequence alignment, forced alignment, detection of motifs, Sequence logos, Markov chains or information entropy. Other novel approaches are also described, such as Self-Sequence alignment, Objective Digital Stains (ODSs) or Spectral Forecast and the Discrete Probability Detector (DPD) algorithm. The text incorporates graphical illustrations to highlight and emphasize the technical details oTable of ContentsPreface xv About the Companion Website xvii 1 The Tree of Life (I) 1 1.1 Introduction 1 1.2 Emergence of Life 1 1.2.1 Timeline Disagreements 3 1.3 Classifications and Mechanisms 4 1.4 Chromatin Structure 5 1.5 Molecular Mechanisms 9 1.5.1 Precursor Messenger RNA 9 1.5.2 Precursor Messenger RNA to Messenger RNA 10 1.5.3 Classes of Introns 10 1.5.4 Messenger RNA 10 1.5.5 mRNA to Proteins 11 1.5.6 Transfer RNA 12 1.5.7 Small RNA 12 1.5.8 The Transcriptome 13 1.5.9 Gene Networks and Information Processing 13 1.5.10 Eukaryotic vs. Prokaryotic Regulation 14 1.5.11 What Is Life? 14 1.6 Known Species 14 1.7 Approaches for Compartmentalization 15 1.7.1 Two Main Approaches for Organism Formation 16 1.7.2 Size and Metabolism 16 1.8 Sizes in Eukaryotes 16 1.8.1 Sizes in Unicellular Eukaryotes 17 1.8.2 Sizes in Multicellular Eukaryotes 17 1.9 Sizes in Prokaryotes 17 1.10 Virus Sizes 18 1.10.1 Viruses vs. the Spark of Metabolism 20 1.11 The Diffusion Coefficient 20 1.12 The Origins of Eukaryotic Cells 21 1.12.1 Endosymbiosis Theory 21 1.12.2 DNA and Organelles 22 1.12.3 Membrane-bound Organelles with DNA 23 1.12.4 Membrane-bound Organelles Without DNA 23 1.12.5 Control and Division of Organelles 24 1.12.6 The Horizontal Gene Transfer 24 1.12.7 On the Mechanisms of Horizontal Gene Transfer 25 1.13 Origins of Eukaryotic Multicellularity 26 1.13.1 Colonies Inside an Early Unicellular Common Ancestor 26 1.13.2 Colonies of Early Unicellular Common Ancestors 26 1.13.3 Colonies of Inseparable Early Unicellular Common Ancestors 1.13.4 Chimerism and Mosaicism 28 1.14 Conclusions 29 2 Tree of Life: Genomes (II) 31 2.1 Introduction 31 2.2 Rules of Engagement 31 2.3 Genome Sizes in the Tree of Life 32 2.3.1 Alternative Methods 33 2.3.2 The Weaving of Scales 33 2.3.3 Computations on the Average Genome Size 36 2.3.4 Observations on Data 38 2.4 Organellar Genomes 40 2.4.1 Chloroplasts 40 2.4.2 Apicoplasts 40 2.4.3 Chromatophores 42 2.4.4 Cyanelles 42 2.4.5 Kinetoplasts 42 2.4.6 Mitochondria 43 2.5 Plasmids 43 2.6 Virus Genomes 44 2.7 Viroids and Their Implications 46 2.8 Genes vs. 
Proteins in the Tree of Life 47 2.9 Conclusions 49 3 Sequence Alignment (I) 51 3.1 Introduction 51 3.2 Style and Visualization 51 3.3 Initialization of the Score Matrix 54 3.4 Calculation of Scores 57 3.4.1 Initialization of the Score Matrix for Global Alignment 57 3.4.2 Initialization of the Score Matrix for Local Alignment 62 3.4.3 Optimization of the Initialization Steps 65 3.4.4 Curiosities 66 3.5 Traceback 71 3.6 Global Alignment 75 3.7 Local Alignment 79 3.8 Alignment Layout 84 3.9 Local Sequence Alignment – The Final Version 87 3.10 Complementarity 91 3.11 Conclusions 97 4 Forced Alignment (II) 99 4.1 Introduction 99 4.2 Global and Local Sequence Alignment 100 4.2.1 Short Notes 100 4.2.2 Understanding the Technology 101 4.2.3 Main Objectives 102 4.3 Experiments and Discussions 102 4.3.1 Alignment Layout 106 4.3.2 Forced Alignment Regime 106 4.3.3 Alignment Scores and Significance 109 4.3.4 Optimal Alignments 110 4.3.5 The Main Significance Scores 110 4.3.6 The Information Content 110 4.3.7 The Match Percentage 112 4.3.8 Significance vs. Chance 113 4.3.9 The Importance of Randomness 113 4.3.10 Sequence Quality and the Score Matrix 114 4.3.11 The Significance Threshold 115 4.3.12 Optimal Alignments by Numbers 116 4.3.13 Chaos Theory on Sequence Alignment 116 4.3.14 Image-Encoding Possibilities 116 4.4 Advanced Features and Methods 117 4.4.1 Sequence Detector 117 4.4.2 Parameters 117 4.4.3 Heatmap 118 4.4.4 Text Visualization 123 4.4.5 Graphics for Manuscript Figures and Didactic Presentations 124 4.4.6 Dynamics 124 4.4.7 Independence 125 4.4.8 Limits 125 4.4.9 Local Storage 125 4.5 Conclusions 128 5 Self-Sequence Alignment (I) 129 5.1 Introduction 129 5.2 True Randomness 130 5.3 Information and Compression Algorithms 130 5.4 White Noise and Biological Sequences 131 5.5 The Mathematical Model 131 5.5.1 A Concrete Example 132 5.5.2 Model Dissection 133 5.5.3 Conditions for Maxima and Minima 136 5.6 Noise vs. Redundancy 137 5.7 Global and Local Information Content 137 5.8 Signal Sensitivity 138 5.9 Implementation 140 5.9.1 Global Self-Sequence Alignment 140 5.9.2 Local Self-Sequence Alignment 144 5.10 A Complete Scanner for Information Content 147 5.11 Conclusions 149 6 Frequencies and Percentages (II) 151 6.1 Introduction 151 6.2 Base Composition 152 6.3 Percentage of Nucleotide Combinations 152 6.4 Implementation 153 6.5 A Frequency Scanner 156 6.6 Examples of Known Significance 158 6.7 Observation vs. Expectation 160 6.8 A Frequency Scanner with a Threshold 161 6.9 Conclusions 163 7 Objective Digital Stains (III) 165 7.1 Introduction 165 7.2 Information and Frequency 166 7.3 The Objective Digital Stain 169 7.3.1 A 3D Representation Over a 2D Plane 173 7.3.2 ODSs Relative to the Background 177 7.4 Interpretation of ODSs 181 7.5 The Significance of the Areas in the ODS 183 7.6 Discussions 184 7.6.1 A Similarity Between Dissimilar Sequences 186 7.7 Conclusions 186 8 Detection of Motifs (I) 187 8.1 Introduction 187 8.2 DNA Motifs 187 8.2.1 DNA-binding Proteins vs. 
Motifs and Degeneracy 188 8.2.2 Concrete Examples of DNA Motifs 188 8.3 Major Functions of DNA Motifs 191 8.3.1 RNA Splicing and DNA Motifs 191 8.4 Conclusions 195 9 Representation of Motifs (II) 197 9.1 Introduction 197 9.2 The Training Data 197 9.3 A Visualization Function 198 9.4 The Alignment Matrix 200 9.5 Alphabet Detection 203 9.6 The Position-Specific Scoring Matrix (PSSM) Initialization 206 9.7 The Position Frequency Matrix (PFM) 207 9.8 The Position Probability Matrix (PPM) 208 9.8.1 A Kind of PPM Pseudo-Scanner 209 9.9 The Position Weight Matrix (PWM) 212 9.10 The Background Model 215 9.11 The Consensus Sequence 218 9.11.1 The Consensus – Not Necessarily Functional 219 9.12 Mutational Intolerance 221 9.13 From Motifs to PWMs 222 9.14 Pseudo-Counts and Negative Infinity 226 9.15 Conclusions 229 10 The Motif Scanner (III) 231 10.1 Introduction 231 10.2 Looking for Signals 232 10.3 A Functional Scanner 235 10.4 The Meaning of Scores 239 10.4.1 A Score Value Above Zero 239 10.4.2 A Score Value Below Zero 241 10.4.3 A Score Value of Zero 241 10.5 Conclusions 242 11 Understanding the Parameters (IV) 243 11.1 Introduction 243 11.2 Experimentation 243 11.2.1 A Scanner Implementation Based on Pseudo-Counts 244 11.2.2 A Scanner Implementation Based on Propagation of Zero Counts 246 11.3 Signal Discrimination 249 11.4 False-Positive Results 250 11.5 Sensitivity Adjustments 251 11.6 Beyond Bioinformatics 252 11.7 A Scanner That Uses a Known PWM 253 11.8 Signal Thresholds 256 11.8.1 Implementation and Filter Testing 258 11.9 Conclusions 262 12 Dynamic Backgrounds (V) 263 12.1 Introduction 263 12.2 Toward a Scanner with Two PFMs 263 12.2.1 The Implementation of Dynamic PWMs 264 12.2.2 Issues and Corrections for Dynamic PWMs 271 12.2.3 Solutions for Aberrant Positive Likelihood Values 274 12.3 A Scanner with Two PFMs 280 12.4 Information and Background Frequencies on Score Values 283 12.5 Dynamic Background vs. Null Model 285 12.6 Conclusions 285 13 Markov Chains: The Machine (I) 287 13.1 Introduction 287 13.2 Transition Matrices 287 13.3 Discrete Probability Detector 292 13.3.1 Alphabet Detection 292 13.3.2 Matrix Initialization 293 13.3.3 Frequency Detection 295 13.3.4 Calculation of Transition Probabilities 297 13.3.5 Particularities in Calculating the Transition Probabilities 306 13.4 Markov Chains Generators 307 13.4.1 The Experiment 308 13.4.2 The Implementation 312 13.4.3 Simulation of Transition Probabilities 315 13.4.4 The Markov machine 315 13.4.5 Result Verification 317 13.5 Conclusions 318 14 Markov Chains: Log Likelihood (II) 319 14.1 Introduction 319 14.2 The Log-Likelihood Matrix 319 14.2.1 A Log-Likelihood Matrix Based on the Null Model 320 14.2.2 A Log-Likelihood Matrix Based on Two Models 322 14.3 Interpretation and Use of the Log-Likelihood Matrix 326 14.4 Construction of a Markov Scanner 328 14.5 A Scanner That Uses a Known LLM 337 14.6 The Meaning of Scores 340 14.7 Beyond Bioinformatics 344 14.8 Conclusions 345 15 Spectral Forecast (I) 347 15.1 Introduction 347 15.2 The Spectral Forecast Model 347 15.3 The Spectral Forecast Equation 349 15.4 The Spectral Forecast Inner Workings 350 15.4.1 Each Part on a Single Matrix 351 15.4.2 Both Parts on a Single Matrix 352 15.4.3 Both Parts on Separate Matrices 353 15.4.4 Concrete Example 1 354 15.4.5 Concrete Example 2 357 15.4.6 Concrete Example 3 359 15.5 Implementations 360 15.5.1 Spectral Forecast for Signals 362 15.5.2 What Does the Value of d Mean? 
364 15.5.3 Spectral Forecast for Matrices 368 15.6 The Spectral Forecast Model for Predictions 372 15.6.1 The Spectral Forecast Model for Signals 372 15.6.2 Experiments on the Similarity Index Values 381 15.6.3 The Spectral Forecast Model for Matrices 384 15.7 Conclusions 389 16 Entropy vs. Content (I) 391 16.1 Introduction 391 16.2 Information Entropy 391 16.3 Implementation 395 16.4 Information Content vs. Information Entropy 400 16.4.1 Implementation 403 16.4.2 Additional Considerations 409 16.5 Conclusions 409 17 Philosophical Transactions 411 17.1 Introduction 411 17.2 The Frame of Reference 411 17.2.1 The Fundamental Layer of Complexity 412 17.2.2 On the Complexity of Life 414 17.3 Random vs. Pseudo-random 415 17.4 Random Numbers and Noise 418 17.5 Determinism and Chaos 419 17.5.1 Chaos Without Noise 420 17.5.2 Chaos with Noise 427 17.5.3 Limits of Prediction 430 17.5.4 On the Wings of Chaos 431 17.6 Free Will and Determinism 431 17.6.1 The Greatest Disappointment 432 17.6.2 The Most Powerful Processor in Existence 433 17.6.3 Certainty vs. Interpretation 435 17.6.4 A Wisdom that Applies 436 17.7 Conclusions 439 Appendix A 441 A.1 Association of Numerical Values with Letters 441 A.2 Sorting Values on Columns 443 A.3 The Implementation of a Sequence Logo 446 A.4 Sequence Logos Based on Maximum Values 451 A.5 Using Logarithms to Build Sequence Logos 455 A.6 From a Motif Set to a Sequence Logo 459 References 467 Index 489
£101.66
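For readers curious about the sequence-alignment chapters listed above (initialization of the score matrix, calculation of scores, traceback, global alignment), the following short Python sketch implements the classic Needleman-Wunsch global alignment scheme with a simple match/mismatch/gap scoring. It illustrates the standard technique only; it is not the book's own implementation, and the scoring values are arbitrary choices.

def needleman_wunsch(a, b, match=1, mismatch=-1, gap=-2):
    """Fill the global-alignment score matrix, then trace back one optimal alignment."""
    n, m = len(a), len(b)
    # score[i][j] = best score for aligning a[:i] with b[:j]
    score = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        score[i][0] = i * gap
    for j in range(1, m + 1):
        score[0][j] = j * gap
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            diag = score[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            score[i][j] = max(diag, score[i - 1][j] + gap, score[i][j - 1] + gap)
    # traceback from the bottom-right corner
    aligned_a, aligned_b, i, j = [], [], n, m
    while i > 0 or j > 0:
        if i > 0 and j > 0 and score[i][j] == score[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch):
            aligned_a.append(a[i - 1])
            aligned_b.append(b[j - 1])
            i, j = i - 1, j - 1
        elif i > 0 and score[i][j] == score[i - 1][j] + gap:
            aligned_a.append(a[i - 1])
            aligned_b.append('-')
            i -= 1
        else:
            aligned_a.append('-')
            aligned_b.append(b[j - 1])
            j -= 1
    return score[n][m], ''.join(reversed(aligned_a)), ''.join(reversed(aligned_b))

print(needleman_wunsch("GATTACA", "GCATGCU"))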
John Wiley & Sons Inc Algorithms For Dummies
Book SynopsisTable of ContentsIntroduction 1 Part 1: Getting Started with Algorithms 7 Chapter 1: Introducing Algorithms 9 Chapter 2: Considering Algorithm Design 23 Chapter 3: Working with Google Colab 41 Chapter 4: Performing Essential Data Manipulations Using Python 59 Chapter 5: Developing a Matrix Computation Class 79 Part 2: Understanding the Need to Sort and Search 97 Chapter 6: Structuring Data 99 Chapter 7: Arranging and Searching Data 117 Part 3: Exploring the World of Graphs 139 Chapter 8: Understanding Graph Basics 141 Chapter 9: Reconnecting the Dots 161 Chapter 10: Discovering Graph Secrets 195 Chapter 11: Getting the Right Web page 207 Part 4: Wrangling Big Data 223 Chapter 12: Managing Big Data 225 Chapter 13: Parallelizing Operations 249 Chapter 14: Compressing and Concealing Data 267 Part 5: Challenging Difficult Problems 289 Chapter 15: Working with Greedy Algorithms 291 Chapter 16: Relying on Dynamic Programming 307 Chapter 17: Using Randomized Algorithms 331 Chapter 18: Performing Local Search 349 Chapter 19: Employing Linear Programming 367 Chapter 20: Considering Heuristics 381 Part 6: The Part of Tens 401 Chapter 21: Ten Algorithms That Are Changing the World 403 Chapter 22: Ten Algorithmic Problems Yet to Solve 411 Index 417
£18.39
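As a small illustration of the dynamic-programming style covered in Part 5 above (Chapter 16: Relying on Dynamic Programming), here is a short bottom-up coin-change example in Python. The snippet is illustrative only and is not taken from the book.

def min_coins(amount, coins=(1, 5, 10, 25)):
    """Fewest coins summing to `amount`, or None if impossible (bottom-up DP)."""
    INF = float("inf")
    best = [0] + [INF] * amount          # best[a] = fewest coins needed for amount a
    for a in range(1, amount + 1):
        for c in coins:
            if c <= a and best[a - c] + 1 < best[a]:
                best[a] = best[a - c] + 1
    return None if best[amount] == INF else best[amount]

print(min_coins(63))   # 6  (25 + 25 + 10 + 1 + 1 + 1)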
Palgrave Macmillan Cognitive Internet of Things Collaboration to
Book SynopsisTable of Contents1. Introduction2. What is a Cognitive Device?3. Cognitive Devices as Human Assistants4. Cognitive Things in an Organization5. Reuse and Monetization6. Intelligent Observations7. Organization of Knowledge and Problem Solving8. Installation, Training, Maintenance, Security, and Infrastructure9. Machine-to-Machine Interfaces10. Man-to-Machine Interfaces11. Assisting in Human Communications12. Balance of Power and Societal Impacts
£28.12
Taylor & Francis Ltd Toward Deep Neural Networks
Book SynopsisToward Deep Neural Networks: WASD Neuronet Models, Algorithms, and Applications introduces the outlook and extension toward deep neural networks, with a focus on the weights-and-structure determination (WASD) algorithm. Based on the authors' 20 years of research experience on neuronets, the book explores the models, algorithms, and applications of the WASD neuronet, and allows readers to extend the techniques in the book to solve scientific and engineering problems. The book will be of interest to engineers, senior undergraduates, postgraduates, and researchers in the fields of neuronets, computer mathematics, computer science, artificial intelligence, numerical algorithms, optimization, simulation and modeling, deep learning, and data mining. Features Focuses on neuronet models, algorithms, and applications Designs, constructs, develops, analyzes, simulates and compares various WASD neuronet models, such as singTrade ReviewThe book is appealing for graduate students as well as academic and industrial researchers. Based on the comprehensive and systematic research of artificial neural network, especially conventional artificial neural network, the book solves the difficult problem of WASD (weights and structure determination). The book may generate curiosity and also happiness to its readers for learning more in the fields and the researches. - Professor Jinde Cao, Southeast University, Nanjing, China Table of ContentsI Single-Input-Single-Output Neuronet 1 Single-Input Euler-Polynomial WASD Neuronet 2 Single-Input Bernoulli-Polynomial WASD Neuronet 3 Single-Input Laguerre-Polynomial WASD Neuronet II Two-Input-Single-Output Neuronet 4 Two-Input Legendre-Polynomial WASD Neuronet 5 Two-Input Chebyshev-Polynomial-of-Class-1 WASD Neuronet 6 Two-Input Chebyshev-Polynomial-of-Class-2 WASD Neuronet III Three-Input-Single-Output Neuronet 7 Three-Input Euler-Polynomial WASD Neuronet 8 Three-Input Power-Activation WASD Neuronet IV General Multi-Input Neuronet 9 Multi-Input Euler-Polynomial WASD Neuronet 10 Multi-Input Bernoulli-Polynomial WASD Neuronet 11 Multi-Input Hermite-Polynomial WASD Neuronet 12 Multi-Input Sine-Activation WASD Neuronet V Population Applications Using Chebyshev-Activation Neuronet 13 Application to Asian Population Prediction 14 Application to European Population Prediction 15 Application to Oceania Population Prediction 16 Application to Northern American Population Prediction 17 Application to Indian Subcontinent Population Prediction 18 Application to World Population Prediction VI Population Applications Using Power-Activation Neuronet 19 Application to Russian Population Prediction 20 WASD Neuronet versus BP Neuronet Applied to Russia Population Prediction 21 Application to Chinese Population Prediction 22 WASD Neuronet versus BP Neuronet Applied to Chinese Population Prediction VII Other Applications 23 Application to USPD Prediction 24 Application to Time Series Prediction 25 Application to GFR Estimation
£117.00
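The central WASD idea of determining weights directly rather than by iterative training can be hinted at with a small sketch. The Python snippet below is only a rough illustration of that general principle under simplifying assumptions (a single-input, polynomial-activation hidden layer whose output weights are obtained in one step by linear least squares); it is not the book's WASD algorithm, which also determines the network structure, i.e. the number of hidden neurons.

import numpy as np

# Rough sketch: with the hidden-layer outputs fixed (here, powers of the input),
# the output weights can be computed in one step by linear least squares instead
# of iterative, backpropagation-style training. Illustrative only; the function
# and parameter names are made up for this example.

def fit_polynomial_neuronet(x, y, n_hidden):
    """Return output weights w minimizing ||H @ w - y||, where H[:, k] = x**k."""
    H = np.vander(x, N=n_hidden, increasing=True)   # hidden-layer activation matrix
    w, *_ = np.linalg.lstsq(H, y, rcond=None)
    return w

def predict(x, w):
    H = np.vander(x, N=len(w), increasing=True)
    return H @ w

rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 200)
y = np.sin(3 * x) + 0.05 * rng.standard_normal(x.size)  # toy target data
w = fit_polynomial_neuronet(x, y, n_hidden=8)
print("max abs training error:", np.max(np.abs(predict(x, w) - y)))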