I need the best books by the best authors on the following subjects. Money is not a constraint. All subjects are from Computer Science.
If you know the best book for any of these subjects, please comment below with the book title and the author's name. Thanks in advance.
Mathematical logic – Boolean logic and other ways of modeling logical queries; the uses and limitations of formal proof methods
Number theory – Theory of the integers. Used in cryptography as well as a test domain in artificial intelligence.
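To give a feel for how number theory shows up in cryptography, here is a minimal sketch of fast modular exponentiation (square-and-multiply), the workhorse behind RSA and Diffie-Hellman. The name `mod_pow` is my own choice; Python's built-in `pow(base, exp, mod)` does the same job.

```python
def mod_pow(base, exp, mod):
    """Compute (base ** exp) % mod in O(log exp) multiplications."""
    result = 1
    base %= mod
    while exp > 0:
        if exp & 1:                      # current bit of the exponent is set
            result = (result * base) % mod
        base = (base * base) % mod       # square for the next bit
        exp >>= 1
    return result

print(mod_pow(7, 128, 13))               # → 3, same as pow(7, 128, 13)
```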
Graph theory – Foundations for data structures and searching algorithms.
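For a flavor of the graph algorithms these foundations support, a breadth-first search computing fewest-edge distances over an adjacency-list graph (a sketch; names and the sample graph are my own):

```python
from collections import deque

def bfs_distance(adj, start, goal):
    """Fewest-edge distance from start to goal, or None if unreachable."""
    seen = {start}
    queue = deque([(start, 0)])
    while queue:
        node, dist = queue.popleft()
        if node == goal:
            return dist
        for nxt in adj.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, dist + 1))
    return None

graph = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}
print(bfs_distance(graph, "A", "D"))     # → 2
```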
Game theory – Useful in artificial intelligence and cybernetics.
Coding theory – Useful in networking and other areas where computers communicate with each other.
Algorithms and data structures
Algorithms – Sequential and parallel computational procedures for solving a wide range of problems.
Data structures – The organization and manipulation of data.
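As a tiny example of how the two topics interlock, binary search — an algorithm whose correctness depends entirely on the data being kept sorted (a sketch, nothing book-specific):

```python
def binary_search(items, target):
    """Return the index of target in a sorted list, or -1 if absent."""
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if items[mid] == target:
            return mid
        if items[mid] < target:
            lo = mid + 1                 # discard the lower half
        else:
            hi = mid - 1                 # discard the upper half
    return -1

print(binary_search([1, 3, 5, 7, 9], 7))  # → 3
```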
Artificial intelligence – The implementation and study of systems that exhibit an autonomous intelligence or behavior of their own.
Automated reasoning – Solving engines, such as those used in Prolog, which produce the steps to a result given a query over a fact-and-rule database, and automated theorem provers that aim to prove mathematical theorems with some assistance from a programmer.
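Very roughly, a rule-based solving engine can be sketched as forward chaining over a fact-and-rule database (a toy version of the idea; note that Prolog itself actually uses backward chaining with unification, which is considerably more involved):

```python
def forward_chain(facts, rules):
    """Apply rules (premises -> conclusion) until no new facts appear."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if conclusion not in facts and premises <= facts:
                facts.add(conclusion)    # rule fires: derive a new fact
                changed = True
    return facts

rules = [({"rainy"}, "wet_ground"),
         ({"wet_ground"}, "slippery")]
print(forward_chain({"rainy"}, rules))   # derives wet_ground, then slippery
```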
Robotics – Algorithms for controlling the behavior of robots.
Computer vision – Algorithms for identifying three-dimensional objects from a two-dimensional picture.
Machine learning – Automated creation of a set of rules and axioms based on input.
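The "automated creation of rules based on input" can be illustrated with the classic perceptron learning rule, here learning the AND function from its truth table (a minimal sketch; the hyperparameters and names are my own choices):

```python
def train_perceptron(samples, epochs=20, lr=0.1):
    """Learn weights for a linearly separable 2-input binary function."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), label in samples:
            pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = label - pred           # -1, 0, or +1
            w[0] += lr * err * x1        # nudge weights toward the label
            w[1] += lr * err * x2
            b += lr * err
    return w, b

# Learn AND from its truth table.
and_data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(and_data)
```

Because AND is linearly separable, the perceptron convergence theorem guarantees the rule settles on correct weights.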
Virtual Organization – Including virtual enterprises, extended enterprises.
VO Breeding Environment – Including industry clusters, industrial districts, digital business ecosystems.
Professional virtual community – Including virtual teams, social networks.
Communications and Security
Networking – Algorithms and protocols for reliably communicating data across different shared or dedicated media, often including error correction.
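The error-detection idea mentioned here can be shown with the simplest possible scheme, a single even-parity bit, which catches any one-bit transmission error (a sketch; function names are my own):

```python
def add_parity(bits):
    """Append an even-parity bit so the frame has an even number of 1s."""
    return bits + [sum(bits) % 2]

def parity_ok(bits):
    """True if the frame passes the parity check (no single-bit error)."""
    return sum(bits) % 2 == 0

frame = add_parity([1, 0, 1, 1])         # → [1, 0, 1, 1, 1]
corrupted = frame[:]
corrupted[2] ^= 1                        # flip one bit "in transit"
print(parity_ok(frame), parity_ok(corrupted))  # → True False
```

Real protocols use stronger codes (CRCs, Hamming codes, Reed–Solomon), but the principle of adding redundancy to detect or correct errors is the same.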
Computer security – Practical aspects of securing computer systems and computer networks.
Cryptography – Applies results from complexity, probability and number theory to invent and break codes, and analyze the security of cryptographic protocols.
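A concrete taste of cryptography built on number theory: a toy Diffie–Hellman key exchange. The parameters below are deliberately tiny and completely insecure; real systems use primes thousands of bits long.

```python
p, g = 23, 5          # public parameters: a small prime and a generator
a, b = 6, 15          # Alice's and Bob's private keys

A = pow(g, a, p)      # Alice publishes A
B = pow(g, b, p)      # Bob publishes B

shared_alice = pow(B, a, p)   # Alice computes g^(a*b) mod p
shared_bob = pow(A, b, p)     # Bob computes the same value
print(shared_alice, shared_bob)  # → 2 2
```

Both sides arrive at the same secret because (g^b)^a ≡ (g^a)^b (mod p), while an eavesdropper who sees only A and B faces the discrete-logarithm problem.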
Computer architecture – The design, organization, optimization, and verification of a computer system, mostly concerning CPUs and the memory subsystem (and the bus connecting them).
Operating systems – Systems for managing computer programs and providing the basis of a usable system.
Computer graphics – Algorithms both for generating visual images synthetically, and for integrating or altering visual and spatial information sampled from the real world.
Image processing – Determining information from an image through computation.
Human computer interaction – The study and design of computer interfaces that people use.
Concurrent, parallel, and distributed systems
Concurrency – The theory and practice of simultaneous computation; data safety in any multitasking or multithreaded environment.
Parallel computing – Computing using multiple concurrent threads of execution, devising algorithms for solving problems on multiple processors to achieve maximal speed-up compared to sequential execution.
Distributed computing – Computing with multiple devices over a network to accomplish a common objective or task, thereby reducing the time required compared with a single processor working alone.
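The data-safety point under Concurrency can be demonstrated with Python's standard `threading` module: several threads increment a shared counter, and a lock makes the read-modify-write step atomic (a minimal sketch; without the lock, interleaved updates could be lost):

```python
import threading

counter = 0
lock = threading.Lock()

def worker(n):
    global counter
    for _ in range(n):
        with lock:                 # protects the read-modify-write sequence
            counter += 1

threads = [threading.Thread(target=worker, args=(10_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)                     # → 40000, every update preserved
```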
Relational databases – The set-theoretic and algorithmic foundation of databases.
Data mining – Study of algorithms for searching and processing information in documents and databases; closely related to information retrieval.
Programming languages and compilers
Compiler theory – Theory of compiler design, based on Automata theory.
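Compilation begins with lexical analysis, which is where the automata-theory connection is most direct (regular expressions are finite automata in disguise). A toy tokenizer for arithmetic expressions using Python's `re` module (a sketch; the token names are my own):

```python
import re

TOKEN_RE = re.compile(r"\s*(?:(\d+)|(.))")   # skip spaces; number or one char

def tokenize(src):
    """Split an arithmetic expression into NUMBER and OP tokens."""
    tokens = []
    for number, op in TOKEN_RE.findall(src):
        if number:
            tokens.append(("NUMBER", int(number)))
        elif op.strip():
            tokens.append(("OP", op))
    return tokens

print(tokenize("12 + 3*4"))
# → [('NUMBER', 12), ('OP', '+'), ('NUMBER', 3), ('OP', '*'), ('NUMBER', 4)]
```

A real compiler would follow this with parsing, semantic analysis, and code generation.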
Programming language pragmatics – Taxonomy of programming languages, their strengths and weaknesses. Various programming paradigms, such as object-oriented programming.
Programming language theory
Formal semantics – Rigorous mathematical study of the meaning of programs.
Type theory – Formal analysis of the types of data, and the use of these types to understand properties of programs — especially program safety.
Computational science – Constructing mathematical models and quantitative analysis techniques and using computers to analyze and solve scientific problems.
Numerical analysis – Approximate numerical solution of mathematical problems such as root-finding, integration, the solution of ordinary differential equations; the approximation of special functions.
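Root-finding, the first example listed, can be sketched with Newton's method, which iterates x ← x − f(x)/f′(x) until the step is negligible (a minimal sketch; tolerance and iteration cap are my own choices):

```python
def newton_root(f, df, x0, tol=1e-10, max_iter=50):
    """Find a root of f near x0 via Newton's iteration."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:        # converged: last correction was tiny
            break
    return x

# Approximate sqrt(2) as the positive root of x^2 - 2.
root = newton_root(lambda x: x * x - 2, lambda x: 2 * x, 1.0)
print(root)                        # → 1.414213562...
```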
Formal methods – Mathematical approaches for describing and reasoning about software designs.
Software engineering – The principles and practice of designing, developing, and testing programs, as well as proper engineering practices.
Reverse engineering – The application of the scientific method to the understanding of arbitrary existing software.
Algorithm design – Using ideas from algorithm theory to creatively design solutions to real tasks.
Computer programming – The practice of using a programming language to implement algorithms.
Theory of computation
Automata theory – Different logical structures for solving problems.
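A deterministic finite automaton, the simplest of these logical structures, is just a transition table plus a current state. A sketch of a DFA that accepts binary strings with an even number of 1s (state names are my own):

```python
def run_dfa(transitions, start, accepting, string):
    """Simulate a deterministic finite automaton on an input string."""
    state = start
    for symbol in string:
        state = transitions[(state, symbol)]   # exactly one move per symbol
    return state in accepting

# DFA accepting binary strings with an even number of 1s.
t = {("even", "0"): "even", ("even", "1"): "odd",
     ("odd", "0"): "odd", ("odd", "1"): "even"}
print(run_dfa(t, "even", {"even"}, "1101"))    # → False (three 1s)
```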
Computability theory – What can be computed with current models of computation. Proofs developed by Alan Turing and others provide insight into what can and what cannot be computed.
Computational complexity theory – Fundamental bounds (especially time and storage space) on classes of computations.
Quantum computing theory – Explores computational models involving quantum superposition of bits.