Introduction to the Theory of Complex Systems 1st edition by Stefan Thurner, Rudolf Hanel, Peter Klimek – Ebook PDF Instant Download/Delivery. ISBN: 0192555073, 9780192555076
Full download of Introduction to the Theory of Complex Systems, 1st edition, is available after payment.
Product details:
ISBN-10 : 0192555073
ISBN-13 : 9780192555076
Authors : Stefan Thurner, Rudolf Hanel, Peter Klimek
This book is a comprehensive introduction to quantitative approaches to complex adaptive systems. Practically all areas of life on this planet are constantly confronted with complex systems, be it ecosystems, societies, traffic, financial markets, opinion formation and spreading, or the internet and social media. Complex systems are systems composed of many elements that interact strongly with each other, which makes them extremely rich dynamical systems showing a huge range of phenomena. Properties of complex systems that are of particular importance are their efficiency, robustness, resilience, and proneness to collapse. The quantitative tools and concepts needed to understand the co-evolutionary nature of networked systems and their properties are challenging. The book gives a self-contained introduction to these concepts, so that the reader will be equipped with a toolset that allows them to engage in the science of complex systems. Topics covered include random and path-dependent processes, co-evolutionary dynamics, the dynamics of networks, the theory of scaling, and approaches from statistical mechanics and information theory. The book extends beyond the early classical literature in the field of complex systems and summarizes the methodological progress made over the past 20 years in a clear, structured, and comprehensive way.
Introduction to the Theory of Complex Systems, 1st edition: Table of contents
Chapter 1: Introduction to Complex Systems
1.1 Physics, biology, or social science?
1.2 Components from physics
1.2.1 The nature of the fundamental forces
1.2.2 What does predictive mean?
1.2.3 Statistical mechanics—predictability on stochastic grounds
1.2.4 The evolution of the concept of predictability in physics
1.2.5 Physics is analytic, complex systems are algorithmic
1.2.6 What are complex systems from a physics point of view?
1.2.7 A note on chemistry—the science of equilibria
1.3 Components from the life sciences
1.3.1 Chemistry of small systems
1.3.2 Biological interactions happen on networks—almost exclusively
1.3.3 What is evolution?
1.3.3.1 Evolution is not physics
1.3.3.2 The concept of the adjacent possible
1.3.3.3 Summary evolutionary processes
1.3.4 Adaptive and robust—the concept of the edge of chaos
1.3.4.1 How does nature find the edge of chaos?
1.3.4.2 Intuition behind self-organized critical systems—sand pile models
1.3.5 Components taken from the life sciences
1.4 Components from the social sciences
1.4.1 Social systems are continuously restructuring networks
1.5 What are Complex Systems?
1.5.1 What is co-evolution?
1.5.2 The role of the computer
1.6 The structure of the book
1.6.1 What has complexity science contributed to the history of science?
Chapter 2: Probability and Random Processes
2.1 Overview
2.1.1 Basic concepts and notions
2.1.1.1 Trials, odds, chances, and probability
2.1.1.2 Experiments, random variables, and sample space
2.1.1.3 Systems and processes
2.1.1.4 Urns
2.1.1.5 Probability and distribution functions
2.1.2 Probability and information
2.1.2.1 Why are random processes important?
2.2 Probability
2.2.1 Basic probability measures and the Kolmogorov axioms
2.2.2 Histograms and relative frequencies
2.2.3 Mean, variance, and higher moments
2.2.4 More than one random variable
2.2.4.1 Comparing random variables
2.2.4.2 Joint and conditional probabilities
2.2.5 A note on Bayesian reasoning
2.2.6 Bayesian and frequentist thinking
2.2.6.1 Hypothesis testing in the frequentist approach
2.3 The law of large numbers—adding random numbers
2.3.1 The central limit theorem
2.3.1.1 Gaussian distribution function
2.3.1.2 The convolution product
2.3.1.3 Log-normal distribution function
2.3.2 Generalized limit theorems and α-stable processes
2.3.2.1 Distribution functions of α-stable processes
2.3.2.2 Gaussian distribution function
2.3.2.3 Cauchy distribution function
2.3.2.4 Lévy distribution function
2.3.2.5 Lévy flights
2.4 Fat-tailed distribution functions
2.4.1 Distribution functions that show power law tails
2.4.1.1 Pareto distribution
2.4.1.2 Zipf distribution
2.4.1.3 Zipf–Mandelbrot distribution
2.4.1.4 The q-exponential distribution function—Tsallis distribution
2.4.1.5 Student-t distribution
2.4.2 Other distribution functions
2.4.2.1 Exponential or Boltzmann distribution
2.4.2.2 Stretched exponential distribution function
2.4.2.3 Gumbel distribution
2.4.2.4 Weibull distribution
2.4.2.5 Gamma distribution
2.4.2.6 Gompertz distribution
2.4.2.7 Generalized exponential distribution or Lambert-W exponential distribution
2.5 Stochastic processes
2.5.1 Simple stochastic processes
2.5.1.1 Bernoulli processes
2.5.1.2 Binomial distribution functions
2.5.1.3 Multinomial processes
2.5.1.4 Poisson processes
2.5.1.5 Markov processes
2.5.2 History- or path-dependent processes
2.5.3 Reinforcement processes
2.5.4 Driven dissipative systems
2.5.4.1 Processes with dynamical sample spaces
2.6 Summary
2.7 Problems
Chapter 3: Scaling
3.1 Overview
3.1.1 Definition of scaling
3.2 Examples of scaling laws in statistical systems
3.2.1 A note on notation for distribution functions
3.2.1.1 Processes with natural order
3.2.1.2 Rank-ordered processes
3.2.1.3 Frequency distributions
3.3 Origins of scaling
3.3.1 Criticality
3.3.1.1 Critical exponents
3.3.1.2 Universality
3.3.1.3 Percolation
3.3.2 Self-organized criticality
3.3.3 Multiplicative processes
3.3.4 Preferential processes
3.3.5 Sample space reducing processes
3.3.5.1 Zipf’s law emerges
3.3.5.2 The influence of the driving rate
3.3.5.3 State-dependent driving rates and the emergence of statistics
3.3.5.4 Cascading SSR processes
3.3.5.5 Examples of SSR processes
3.3.6 Other mechanisms
3.3.6.1 Exponential growth with random exponentially distributed observation times
3.3.6.2 Random typewriting
3.3.6.3 Information-theoretic context
3.4 Power laws and how to measure them
3.4.1 Maximum likelihood estimator for power law exponents λ < −1
3.4.2 Maximum likelihood estimator for power laws for all exponents
3.5 Scaling in space—symmetry of non-symmetric objects, fractals
3.5.1 Self similarity and scale invariance
3.5.2 Scaling in space: fractals
3.5.2.1 Quantification of fractals, fractal dimension
3.5.2.2 Measuring fractal dimension—box counting
3.5.3 Scaling in time—fractal time series
3.6 Example—understanding allometric scaling in biology
3.6.1 Understanding the 3/4 power law
3.7 Summary
3.8 Problems
Chapter 4: Networks
4.1 Overview
4.1.1 Historical origin of network science
4.1.2 From random matrix theory to random networks
4.1.3 Small worlds and power laws
4.1.4 Networks in the big data era
4.2 Network basics
4.2.1 Networks or graphs?
4.2.2 Nodes and links
4.2.3 Adjacency matrix of undirected networks
4.2.3.1 The adjacency matrix of directed networks
4.2.3.2 Bipartite networks
4.2.3.3 The incidence matrix
4.2.3.4 Multigraphs and hypergraphs
4.2.3.5 Network duals and the Laplacian matrix
4.3 Measures on networks
4.3.1 Degree of a node
4.3.1.1 Degree, in-degree, and out-degree
4.3.1.2 Degree distribution
4.3.1.3 Nearest-neighbour degrees and assortativity
4.3.1.4 Connectivity and connectancy
4.3.2 Walking on networks
4.3.2.1 Walks and paths
4.3.2.2 Circuits and cycles
4.3.3 Connectedness and components
4.3.3.1 Components of networks
4.3.3.2 Trees and forests
4.3.3.3 Bow tie structure of directed networks
4.3.4 From distances on networks to centrality
4.3.4.1 Geodesic paths, diameter, and characteristic distances
4.3.4.2 Closeness and betweenness centrality
4.3.5 Clustering coefficient
4.3.5.1 From cycles and paths to clustering
4.3.5.2 Individual clustering coefficient
4.3.5.3 Overall and average clustering
4.3.5.4 Clustering in directed networks
4.4 Random networks
4.4.1 Three sources of randomness
4.4.1.1 Is a random network really a network?
4.4.2 Erdős–Rényi networks
4.4.2.1 Definition of the Erdős–Rényi ensemble
4.4.2.2 Degree distribution
4.4.2.3 Moments of the degree distribution
4.4.2.4 Clustering coefficient
4.4.3 Phase transitions in Erdős–Rényi networks
4.4.3.1 The percolation transition of Erdős–Rényi networks
4.4.3.2 Size-dependent threshold functions
4.4.3.3 Important thresholds for Erdős–Rényi networks
4.4.3.4 Giant components
4.4.4 Eigenvalue spectra of random networks
4.4.4.1 Perron–Frobenius theorem
4.4.4.2 Wigner’s semicircle law
4.4.4.3 Girko’s circular law
4.4.4.4 Wigner’s and Girko’s laws for random networks
4.5 Beyond Erdős–Rényi—complex networks
4.5.1 The configuration model
4.5.1.1 Rewiring in the configuration model
4.5.1.2 Rewiring to erase topological information
4.5.1.3 The friendship paradox
4.5.2 Network superposition model
4.5.2.1 Superstatistics of Erdős–Rényi networks
4.5.3 Small worlds
4.5.3.1 Six degrees of separation
4.5.3.2 High clustering and low characteristic distance
4.5.3.3 Watts–Strogatz model
4.5.4 Hubs and scale-free networks
4.5.4.1 Scale-free degree distributions
4.5.4.2 Preferential attachment
4.5.4.3 Barabási–Albert model for growing scale-free networks
4.5.4.4 Degree distribution of the Barabási–Albert model
4.5.4.5 Other properties of the Barabási–Albert model
4.5.4.6 Hubs in non-growing networks
4.5.4.7 Preferential attachment in practice
4.6 Communities
4.6.1 Graph partitioning and minimum cuts
4.6.2 Hierarchical clustering
4.6.3 Divisive clustering in the Girvan–Newman algorithm
4.6.4 Modularity optimization
4.6.4.1 Girvan–Newman algorithm
4.6.4.2 Louvain method
4.6.4.3 Extremal optimization
4.6.4.4 Spectral method
4.6.4.5 Which algorithm is the right one for my problem?
4.7 Functional networks—correlation network analysis
4.7.1 Construction of correlation networks
4.7.1.1 Correlation between binary variables
4.7.1.2 Correlation between continuous variables
4.7.1.3 Correlation between a binary and a continuous variable
4.7.2 Filtering the correlation network
4.7.2.1 Naive global thresholding
4.7.2.2 Multiple hypothesis correction
4.7.2.3 The maximum spanning tree
4.7.2.4 Network backboning with the disparity filter
4.7.2.5 How should I choose my p, α, and Q?
4.8 Dynamics on and of networks—from diffusion to co-evolution
4.8.1 Diffusion on networks
4.8.2 Laplacian diffusion on networks
4.8.3 Eigenvector centrality
4.8.4 Katz prestige
4.8.5 PageRank
4.8.6 Contagion dynamics and epidemic spreading
4.8.6.1 Epidemic spreading on networks in the SI model
4.8.6.2 SI model with immunization
4.8.6.3 Epidemic spreading on networks in the SIS model
4.8.6.4 SIS model with immunization
4.8.6.5 Generalized epidemic spreading, SIR models
4.8.7 Co-evolving spreading models—adaptive networks
4.8.8 Simple models for social dynamics
4.8.8.1 Voter model for opinion formation
4.8.8.2 Co-evolving voter model
4.9 Generalized networks
4.9.1 Hypergraphs
4.9.2 Power graphs
4.9.2.1 Bipartite representation of power graphs
4.9.3 Multiplex networks
4.9.3.1 Adjacency tensors
4.9.4 Multilayer networks
4.10 Example—systemic risk in financial networks
4.10.1 Quantification of systemic risk
4.10.1.1 DebtRank—how important are banks systemically?
4.10.1.2 Expected systemic loss
4.10.1.3 Systemic risk of individual transactions
4.10.2 Management of systemic risk
4.11 Summary
4.12 Problems
Chapter 5: Evolutionary Processes
5.1 Overview
5.1.1 Science of evolution
5.1.2 Evolution as an algorithmic three-step process
5.1.3 What can be expected from a science of evolution?
5.2 Evidence for complex dynamics in evolutionary processes
5.2.1 Criticality, punctuated equilibria, and the abundance of fat-tailed statistics
5.2.1.1 Evidence from the fossil record
5.2.1.2 Evidence from economics
5.2.1.3 Open-endedness
5.2.2 Evidence for combinatorial co-evolution
5.2.2.1 Evidence from evolutionary biology
5.2.2.2 Evidence from technological development
5.2.2.3 Evidence from economic development
5.3 From simple evolution models to a general evolution algorithm
5.3.1 Traditional approaches to evolution—the replicator equation
5.3.1.1 Replication
5.3.1.2 Competition
5.3.1.3 Mutation
5.3.1.4 Combination
5.3.2 Limits to the traditional approach
5.3.2.1 Size and initial conditions
5.3.2.2 Well-stirred reactor
5.3.2.3 Open-endedness
5.3.2.4 Co-evolution and separation of timescales
5.3.3 Towards a general evolution algorithm
5.3.3.1 Co-evolution between species and their interactions
5.3.4 General evolution algorithm
5.4 What is fitness?
5.4.1 Fitness landscapes?
5.4.1.1 Geometrical interpretation of fitness functions—fitness landscapes
5.4.2 Simple fitness landscape models
5.4.2.1 Static fitness functions: Lotka–Volterra dynamics
5.4.3 Evolutionary dynamics on fitness landscapes
5.4.3.1 NK model
5.4.3.2 Dynamics of the NK model
5.4.3.3 NKCS model
5.4.3.4 NK models and spin models
5.4.3.5 NK models for technological change: Wright’s law of technological progress
5.4.3.6 Trivial case of single-component technologies
5.4.3.7 Technological complexity—design structure matrix
5.4.4 Co-evolving fitness landscapes—the Bak–Sneppen model
5.4.4.1 Dynamics of the Bak–Sneppen model
5.4.4.2 Self-organized criticality and punctuated equilibria
5.4.5 The adjacent possible in fitness landscape models
5.5 Linear evolution models
5.5.1 Emergence of auto-catalytic sets—the Jain–Krishna model
5.5.1.1 Catalytic reactions of polymers and the origin of life
5.5.1.2 Linear reaction dynamics—the fast timescale
5.5.1.3 Extinction dynamics—the long timescale
5.5.1.4 Combined dynamics—the Jain–Krishna model
5.5.1.5 Diversity, eigenvalues, and auto-catalytic cycles
5.5.1.6 Understanding the emergence of diversity
5.5.1.7 Collapse—understanding mass extinctions
5.5.1.8 Introducing competition—emergence of cooperation
5.5.2 Sequentially linear models and the edge of chaos
5.5.2.1 A minimally non-linear evolution model
5.5.2.2 A simple example
5.5.2.3 Sequentially linear models and the edge of chaos
5.5.3 Systemic risk in evolutionary systems—modelling collapse
5.5.3.1 Extinction in the Solé–Manrubia model
5.5.3.2 Financial networks and cascading failure
5.6 Non-linear evolution models—combinatorial evolution
5.6.1 Schumpeter got it right
5.6.2 Generic creative phase transition
5.6.2.1 Quantifying the adjacent possible
5.6.2.2 The generic phase transition in combinatorial evolutionary systems
5.6.3 Arthur–Polak model of technological evolution
5.6.4 The open-ended co-evolving combinatorial critical model—CCC model
5.6.4.1 Combinatorial interactions in the CCC model
5.6.4.2 Critical behaviour in the CCC model
5.6.4.3 Co-evolution of species and fitness landscapes in the CCC model
5.6.5 CCC model in relation to other evolutionary models
5.6.5.1 Creative phase transition of random catalytic sets
5.6.5.2 Linear interaction models
5.6.5.3 NK models
5.7 Examples—evolutionary models for economic predictions
5.7.1 Estimation of fitness of countries from economic data
5.7.1.1 Product diversity and economic growth
5.7.1.2 Two regimes in economic development
5.7.2 Predicting product diversity from data
5.7.2.1 The multicountry CCC model
5.7.2.2 Calibration of the model
5.8 Summary
5.9 Problems
Chapter 6: Statistical Mechanics and Information Theory for Complex Systems
6.1 Overview
6.1.1 The three faces of entropy
6.1.1.1 Physics and entropy
6.1.1.2 Information theory
6.1.1.3 Maximum entropy principle
6.1.1.4 Trinity of entropy for simple systems: three-in-one
6.1.1.5 Entropy and equilibrium
6.1.1.6 Entropy for complex systems—the challenges and promises
6.2 Classical notions of entropy for simple systems
6.2.1 Entropy and physics
6.2.1.1 Example—average values and the most likely configuration
6.2.1.2 Composing systems—additivity and extensivity
6.2.1.3 Entropy and thermodynamic relations
6.2.1.4 The Boltzmann distribution
6.2.1.5 Other entropies in physics
6.2.2 Entropy and information
6.2.2.1 A simple example
6.2.2.2 Three reasons why Shannon entropy is a good measure of uncertainty
6.2.2.3 Sources and information production
6.2.2.4 Shannon’s noisy channel coding theorem
6.2.2.5 Information production of a source
6.2.2.6 Markov processes and ergodicity
6.2.2.7 Source properties
6.2.2.8 A note on information production and coding
6.2.2.9 Entropy rate of emitted sequences
6.2.2.10 Shannon’s axioms of information theory
6.2.3 Entropy and statistical inference
6.2.3.1 The maximum entropy principle
6.2.4 Limits of the classical entropy concept
6.3 Entropy for complex systems
6.3.1 Complex systems violate ergodicity
6.3.1.1 Violation of the composition axiom
6.3.2 Shannon–Khinchin axioms for complex systems
6.3.3 Entropy for complex systems
6.3.4 Special cases
6.3.4.1 Classical entropy of Boltzmann, Gibbs, Shannon, and Jaynes
6.3.4.2 Tsallis entropy
6.3.4.3 Anteneodo–Plastino entropy
6.3.5 Classification of complex systems based on their entropy
6.3.5.1 Equivalence classes—when are two entropies equivalent?
6.3.5.2 Non-trace-form entropies—a note on Rényi entropy
6.3.6 Distribution functions from the complex systems entropy
6.3.7 Consequences for entropy when giving up ergodicity
6.3.8 Systems that violate more than the composition axiom
6.4 Entropy and phase space for physical complex systems
6.4.1 Requirement of extensivity
6.4.1.1 Imposing the requirement of extensivity
6.4.2 Phase space volume and entropy
6.4.2.1 How phase space determines entropy
6.4.2.2 Simple examples
6.4.2.3 A formula for phase space volume as a function of system size
6.4.2.4 Subexponential phase space growth and strongly constrained processes
6.4.3 Some examples
6.4.3.1 Ageing random walks
6.4.3.2 A social network model: join-a-club spin system
6.4.3.3 Black hole entropy
6.4.4 What does non-exponential phase space growth imply?
6.5 Maximum entropy principle for complex systems
6.5.1 Path-dependent processes and multivariate distributions
6.5.2 When does a maximum entropy principle exist for path-dependent processes?
6.5.2.1 Towards a generalized maximum entropy principle
6.5.2.2 If a factorization exists—what then?
6.5.2.3 Deriving the maximum entropy for path-dependent processes
6.5.2.4 The most likely distribution function for path-dependent processes
6.5.3 Example—maximum entropy principle for path-dependent random walks
6.6 The three faces of entropy revisited
6.6.1 The three entropies of the Pólya urn process
6.6.1.1 The Pólya urn process
6.6.1.2 The information production entropy of the Pólya urn process
6.6.1.3 The extensive entropy of the Pólya urn process
6.6.1.4 The maximum entropy of the Pólya urn process
6.6.2 The three entropies of sample space reducing processes
6.6.2.1 The SSR process
6.6.2.2 The entropy production rate of the SSR process
6.6.2.3 The extensive entropy of the SSR process
6.6.2.4 The max entropy of the SSR process
6.7 Summary
6.8 Problems
Chapter 7: The Future of the Science of Complex Systems?
Chapter 8: Special Functions and Approximations
8.1 Special functions
8.1.1 Heaviside step function
8.1.2 Dirac delta function
8.1.3 Kronecker delta
8.1.4 The Lambert-W function
8.1.5 Gamma function
8.1.6 Incomplete Gamma function
8.1.7 Deformed factorial
8.1.8 Deformed multinomial
8.1.9 Generalized logarithm
8.1.10 Pearson correlation coefficient
8.1.11 Chi-squared distribution
8.2 Approximations
8.2.1 Stirling’s formula
8.2.2 Expressing the exponential function as a power
8.3 Problems