Page 19 of results: 10,730 digital items found in 0.057 seconds

Applied stochastic Eigen-analysis

Nadakuditi, Rajesh Rao
Source: Massachusetts Institute of Technology Publisher: Massachusetts Institute of Technology
Type: Doctoral Thesis Format: 200, [3] leaves
Portuguese
Search Relevance
467.05996%
The first part of the dissertation investigates the application of the theory of large random matrices to high-dimensional inference problems when the samples are drawn from a multivariate normal distribution. A longstanding problem in sensor array processing is addressed by designing an estimator for the number of signals in white noise that dramatically outperforms that proposed by Wax and Kailath. This methodology is extended to develop new parametric techniques for testing and estimation. Unlike techniques found in the literature, these exhibit robustness to high-dimensionality, sample size constraints and eigenvector misspecification. By interpreting the eigenvalues of the sample covariance matrix as an interacting particle system, the existence of a phase transition phenomenon in the largest ("signal") eigenvalue is derived using heuristic arguments. This exposes a fundamental limit on the identifiability of low-level signals due to sample size constraints when using the sample eigenvalues alone. The analysis is extended to address a problem in sensor array processing, posed by Baggeroer and Cox, on the distribution of the outputs of the Capon-MVDR beamformer when the sample covariance matrix is diagonally loaded.; (cont.) The second part of the dissertation investigates the limiting distribution of the eigenvalues and eigenvectors of a broader class of random matrices. A powerful method is proposed that expands the reach of the theory beyond the special cases of matrices with Gaussian entries; this simultaneously establishes a framework for computational (non-commutative) "free probability" theory. The class of "algebraic" random matrices is defined and the generators of this class are specified. Algebraicity of a random matrix sequence is shown to act as a certificate of the computability of the limiting eigenvalue distribution and...
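The phase transition in the largest ("signal") eigenvalue described above has a standard closed form in the random-matrix literature (the spiked-covariance result associated with Baik, Ben Arous and Péché). As a sketch only, and not necessarily the dissertation's exact statement: with one signal eigenvalue $\lambda$, unit noise power, $p$ sensors, $n$ snapshots, and $c = p/n$ held fixed as both grow, the largest sample eigenvalue behaves as

```latex
l_{\max} \;\xrightarrow{\ \mathrm{a.s.}\ }\;
\begin{cases}
\lambda\left(1 + \dfrac{c}{\lambda - 1}\right), & \lambda > 1 + \sqrt{c}
  \quad \text{(signal separates from the noise bulk)},\\[6pt]
\left(1 + \sqrt{c}\right)^{2}, & \lambda \le 1 + \sqrt{c}
  \quad \text{(signal indistinguishable from noise)}.
\end{cases}
```

This is the sense in which sample size alone limits the identifiability of low-level signals from sample eigenvalues.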

Field computation and nonpropositional knowledge

MacLennan, Bruce J.
Source: Monterey, California. Naval Postgraduate School Publisher: Monterey, California. Naval Postgraduate School
Type: Report
Portuguese
Search Relevance
466.3597%
Most current AI technology has been based on propositionally represented theoretical knowledge. It is argued that if AI is to accomplish its goals, especially in the tasks of sensory interpretation and sensorimotor coordination, then it must solve the problem of representing embodied practical knowledge. Biological evidence shows that animals use this knowledge in a way very different from digital computation. This suggests that if these problems are to be solved, then we will need a new breed of computers, which we call field computers. Examples of field computers are: neurocomputers, optical computers, molecular computers, and any kind of massively parallel analog computer. The author claims that the principal characteristic of all these computers is their massive parallelism, but uses this term in a special way. He argues that true massive parallelism comes when the number of processors is so large that it can be considered a continuous quantity. Designing and programming these computers requires a new theory of computation, one version of which is presented in this paper. Described is a universal field computer, that is, a field computer that can emulate any other field computer. It is based on a generalization of Taylor's theorem to continuous-dimensional vector spaces. A number of field computations are illustrated...
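The record does not reproduce the generalization of Taylor's theorem itself. As a hedged sketch, the standard functional (continuum) Taylor expansion, the kind of identity on which such a universal field computer can be based, reads:

```latex
F[\phi + \delta\phi] \;=\; F[\phi]
 \;+\; \int \frac{\delta F}{\delta\phi(x)}\,\delta\phi(x)\,dx
 \;+\; \frac{1}{2!}\iint \frac{\delta^{2} F}{\delta\phi(x)\,\delta\phi(y)}\,
       \delta\phi(x)\,\delta\phi(y)\,dx\,dy \;+\; \cdots
```

Here a "processor index" has become the continuous variable $x$, so a field transformation $F$ is approximated by integral operators rather than finite sums.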

Unification modulo a partial theory of exponentiation

Kapur, Deepak; Marshall, Andrew; Narendran, Paliath
Source: Cornell University Publisher: Cornell University
Type: Journal Article
Published on 22/12/2010 Portuguese
Search Relevance
466.35477%
Modular exponentiation is a common mathematical operation in modern cryptography. This, along with modular multiplication at the base and exponent levels (to different moduli) plays an important role in a large number of key agreement protocols. In our earlier work, we gave many decidability as well as undecidability results for multiple equational theories, involving various properties of modular exponentiation. Here, we consider a partial subtheory focussing only on exponentiation and multiplication operators. Two main results are proved. The first result is positive, namely, that the unification problem for the above theory (in which no additional property is assumed of the multiplication operators) is decidable. The second result is negative: if we assume that the two multiplication operators belong to two different abelian groups, then the unification problem becomes undecidable.; Comment: In Proceedings UNIF 2010, arXiv:1012.4554
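The abstract does not list the equational axioms. For orientation only, a partial theory of exponentiation and multiplication of the kind described typically includes identities such as the following, writing $\cdot$ for the exponent-level and $\ast$ for the base-level multiplication (the paper's exact axiom set may differ):

```latex
\exp(\exp(x, y),\, z) \;\approx\; \exp(x,\; y \cdot z),
\qquad
\exp(x \ast y,\; z) \;\approx\; \exp(x, z) \ast \exp(y, z)
```

Unification modulo such identities asks whether two terms built from $\exp$, $\cdot$ and $\ast$ can be made equal by substituting for their variables.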

Improving circuit miniaturization and its efficiency using Rough Set Theory

Rawat, Sarvesh SS; Mor, Dheeraj Dilip; Kumar, Anugrah; Roy, Sanjiban Shekar; kumar, Rohit
Source: Cornell University Publisher: Cornell University
Type: Journal Article
Published on 10/12/2013 Portuguese
Search Relevance
466.53094%
High speed, accuracy, and quick response are vital necessities in the modern digital world, and an efficient electronic circuit directly affects the operation of the whole system. Different tools are required to solve different types of engineering problems, and improving the efficiency, accuracy, and power consumption of an electronic circuit has always been a bottleneck, so the need for circuit miniaturization is always there. Miniaturization saves much of the time and power wasted in the switching of gates, reduces the wiring problem, reduces the cross-sectional area of the chip, and multiplies many fold the number of transistors that can be implemented on a chip. To overcome this problem we propose an artificial intelligence (AI) based approach that makes use of rough set theory. Rough set theory, proposed by Z. Pawlak in 1982, is a mathematical tool for dealing with uncertainty and vagueness; decisions can be generated with it by reducing unwanted and superfluous data. We have reduced the number of gates without affecting the functionality of the given circuit. This paper proposes an approach, based on rough set theory, that essentially lessens the number of gates in the circuit...
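The abstract does not give the reduction algorithm. The following is a minimal sketch, under our own assumptions, of the core rough-set computation such an approach relies on: finding reducts, i.e. minimal subsets of condition attributes that still determine the decision. The truth table is a hypothetical three-input circuit, not one from the paper.

```python
from itertools import combinations

# Toy decision table: each row is (condition attribute values, decision).
# Hypothetical truth table for a small circuit with inputs a, b, c.
rows = [
    ((0, 0, 0), 0),
    ((0, 0, 1), 0),
    ((0, 1, 0), 1),
    ((0, 1, 1), 1),
    ((1, 0, 0), 0),
    ((1, 0, 1), 0),
    ((1, 1, 0), 1),
    ((1, 1, 1), 1),
]
# By construction the output equals input b, so {b} should be a reduct.

def consistent(attr_subset):
    """True if rows indiscernible on attr_subset never disagree on the decision."""
    seen = {}
    for values, decision in rows:
        key = tuple(values[i] for i in attr_subset)
        if seen.setdefault(key, decision) != decision:
            return False
    return True

def reducts(n_attrs):
    """All minimal attribute subsets that preserve the decision."""
    result = []
    for k in range(1, n_attrs + 1):
        for subset in combinations(range(n_attrs), k):
            # Skip supersets of an already-found reduct (minimality).
            if consistent(subset) and not any(set(r) <= set(subset) for r in result):
                result.append(subset)
    return result

print(reducts(3))  # [(1,)]: attribute 1 (input b) alone determines the output
```

Dropping the inputs outside a reduct is what allows gates that depend only on them to be removed without changing the circuit's output.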

Quantum Set Theory Extending the Standard Probabilistic Interpretation of Quantum Theory (Extended Abstract)

Ozawa, Masanao
Source: Cornell University Publisher: Cornell University
Type: Journal Article
Published on 29/12/2014 Portuguese
Search Relevance
466.53094%
The notion of equality between two observables plays many important roles in the foundations of quantum theory. However, the standard probabilistic interpretation based on the conventional Born formula does not give the probability of the equality relation for a pair of arbitrary observables, since the Born formula gives the probability distribution only for a commuting family of observables. In this paper, quantum set theory developed by Takeuti and the present author is used to systematically extend the probabilistic interpretation of quantum theory and define the probability of the equality relation for a pair of arbitrary observables. Applications of this new interpretation to measurement theory are discussed briefly.; Comment: In Proceedings QPL 2014, arXiv:1412.8102

Quantum Gauge Field Theory in Cohesive Homotopy Type Theory

Schreiber, Urs; Shulman, Michael
Source: Cornell University Publisher: Cornell University
Type: Journal Article
Published on 31/07/2014 Portuguese
Search Relevance
466.53094%
We implement in the formal language of homotopy type theory a new set of axioms called cohesion. Then we indicate how the resulting cohesive homotopy type theory naturally serves as a formal foundation for central concepts in quantum gauge field theory. This is a brief survey of work by the authors developed in detail elsewhere.; Comment: In Proceedings QPL 2012, arXiv:1407.8427

Ground interpolation for the theory of equality

Fuchs, Alexander; Goel, Amit; Grundy, Jim; Krstić, Sava; Tinelli, Cesare
Source: Cornell University Publisher: Cornell University
Type: Journal Article
Portuguese
Search Relevance
466.53094%
Theory interpolation has found several successful applications in model checking. We present a novel method for computing interpolants for ground formulas in the theory of equality. The method produces interpolants from colored congruence graphs representing derivations in that theory. These graphs can be produced by conventional congruence closure algorithms in a straightforward manner. By working with graphs, rather than at the level of individual proof steps, we are able to derive interpolants that are pleasingly simple (conjunctions of Horn clauses) and smaller than those generated by other tools. Our interpolation method can be seen as a theory-specific implementation of a cooperative interpolation game between two provers. We present a generic version of the interpolation game, parametrized by the theory T, and define a general method to extract runs of the game from proofs in T and then generate interpolants from these runs.
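The interpolation method itself is not reproduced in this record. As background, here is a deliberately naive sketch of the congruence closure step that produces the derivations the colored congruence graphs represent (the term encoding and function names are ours, not the paper's):

```python
class UnionFind:
    """Minimal union-find over hashable terms."""
    def __init__(self):
        self.parent = {}

    def find(self, x):
        self.parent.setdefault(x, x)
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]  # path halving
            x = self.parent[x]
        return x

    def union(self, x, y):
        self.parent[self.find(x)] = self.find(y)

def congruence_closure(equalities, terms):
    """Naive congruence closure; terms are constants ('a') or
    applications encoded as tuples like ('f', 'a')."""
    uf = UnionFind()
    for s, t in equalities:
        uf.union(s, t)
    changed = True
    while changed:
        changed = False
        # Congruence rule: same function symbol and pairwise-equal
        # arguments force the applications to be equal.
        for s in terms:
            for t in terms:
                if (isinstance(s, tuple) and isinstance(t, tuple)
                        and s[0] == t[0] and len(s) == len(t)
                        and all(uf.find(a) == uf.find(b)
                                for a, b in zip(s[1:], t[1:]))
                        and uf.find(s) != uf.find(t)):
                    uf.union(s, t)
                    changed = True
    return uf

# From a = b, conclude f(a) = f(b):
terms = ['a', 'b', ('f', 'a'), ('f', 'b')]
uf = congruence_closure([('a', 'b')], terms)
print(uf.find(('f', 'a')) == uf.find(('f', 'b')))  # True
```

Production congruence closure algorithms avoid the quadratic fixpoint loop, but this is the relation whose proof structure the interpolation method colors and traverses.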

Recognizing Speech in a Novel Accent: The Motor Theory of Speech Perception Reframed

Moulin-Frier, Clément; Arbib, M. A.
Source: Cornell University Publisher: Cornell University
Type: Journal Article
Published on 20/09/2013 Portuguese
Search Relevance
466.53094%
The motor theory of speech perception holds that we perceive the speech of another in terms of a motor representation of that speech. However, when we have learned to recognize a foreign accent, it seems plausible that recognition of a word rarely involves reconstruction of the speech gestures of the speaker rather than the listener. To better assess the motor theory and this observation, we proceed in three stages. Part 1 places the motor theory of speech perception in a larger framework based on our earlier models of the adaptive formation of mirror neurons for grasping, and for viewing extensions of that mirror system as part of a larger system for neuro-linguistic processing, augmented by the present consideration of recognizing speech in a novel accent. Part 2 then offers a novel computational model of how a listener comes to understand the speech of someone speaking the listener's native language with a foreign accent. The core tenet of the model is that the listener uses hypotheses about the word the speaker is currently uttering to update probabilities linking the sound produced by the speaker to phonemes in the native language repertoire of the listener. This, on average, improves the recognition of later words. This model is neutral regarding the nature of the representations it uses (motor vs. auditory). It serves as a reference point for the discussion in Part 3...
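The computational model is only summarized in the abstract. The following toy version of its core tenet (our own simplification, with hypothetical sounds and phonemes) updates sound-to-phoneme probabilities from hypothesized words using Laplace-smoothed counts:

```python
from collections import defaultdict

# counts[sound][phoneme] starts at 1.0 (a Laplace-style prior), so every
# candidate mapping is initially possible.
counts = defaultdict(lambda: defaultdict(lambda: 1.0))

def update(sound, intended_phoneme):
    """Reinforce the mapping the hypothesized word implies."""
    counts[sound][intended_phoneme] += 1.0

def prob(sound, phoneme):
    """P(phoneme | sound), for phonemes already observed for this sound."""
    total = sum(counts[sound].values())
    return counts[sound][phoneme] / total

# The listener hypothesizes that an accented [z] realizes native /s/ in
# several recognized words, and once that [z] really was /z/:
for _ in range(8):
    update('z', 's')
update('z', 'z')

print(round(prob('z', 's'), 2))  # 0.82: the mapping has sharpened toward /s/
```

Each successfully hypothesized word thus makes later words in the same accent easier to recognize, which is the "improves the recognition of later words" claim in miniature.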

The equivalence of the torus and the product of two circles in homotopy type theory

Sojakova, Kristina
Source: Cornell University Publisher: Cornell University
Type: Journal Article
Published on 13/10/2015 Portuguese
Search Relevance
466.53094%
Homotopy type theory is a new branch of mathematics which merges insights from abstract homotopy theory and higher category theory with those of logic and type theory. It allows us to represent a variety of mathematical objects as basic type-theoretic constructions known as higher inductive types. We present a proof that in homotopy type theory, the torus is equivalent to the product of two circles. This result indicates that the synthetic definition of the torus as a higher inductive type is indeed correct.
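For reference, the torus is standardly presented as a higher inductive type with a point, two loops, and a 2-cell making the loops commute (constructor names here are conventional, not necessarily the paper's):

```latex
b : T^{2}, \qquad
p,\; q : b =_{T^{2}} b, \qquad
t : p \cdot q \;=_{(b\, =\, b)}\; q \cdot p,
```

and the theorem proved is the equivalence $T^{2} \simeq S^{1} \times S^{1}$, matching the classical fact that the torus is the product of two circles.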

Inductive types in homotopy type theory

Awodey, Steve; Gambino, Nicola; Sojakova, Kristina
Source: Cornell University Publisher: Cornell University
Type: Journal Article
Portuguese
Search Relevance
466.53094%
Homotopy type theory is an interpretation of Martin-L\"of's constructive type theory into abstract homotopy theory. There results a link between constructive mathematics and algebraic topology, providing topological semantics for intensional systems of type theory as well as a computational approach to algebraic topology via type theory-based proof assistants such as Coq. The present work investigates inductive types in this setting. Modified rules for inductive types, including types of well-founded trees, or W-types, are presented, and the basic homotopical semantics of such types are determined. Proofs of all results have been formally verified by the Coq proof assistant, and the proof scripts for this verification form an essential component of this research.; Comment: 19 pages; v2: added references and acknowledgements, removed appendix with Coq README file, updated URL for Coq files. To appear in the proceedings of LICS 2012

Teaching Wireless Sensor Networks: An Holistic Approach Bridging Theory and Practice at the Master Level

Fischione, Carlo
Source: Cornell University Publisher: Cornell University
Type: Journal Article
Published on 09/10/2013 Portuguese
Search Relevance
466.48824%
Wireless Sensor Networks (WSNs) are a new technology that has received substantial attention from several academic research fields in recent years. There are many applications of WSNs, including environmental monitoring, industrial automation, intelligent transportation systems, healthcare and wellbeing, and smart energy, to mention a few. Courses have been introduced both at the PhD and at the Master levels. However, these existing courses focus on particular aspects of WSNs (networking, or signal processing, or embedded software), whereas WSNs encompass disciplines traditionally separated in electrical engineering and computer science. This paper gives two original contributions: the essential knowledge that should be covered in a WSNs course is characterized, and a course structure with a harmonious holistic approach is proposed. A method based on both theory and experiments is illustrated for the design of this course, whereby the students have hands-on experiments to implement, understand, and develop in practice the implications of theoretical concepts. Theory and applications are thus considered together. Ultimately, the objective of this paper is to design a new course and to use innovative hands-on experiments to illustrate the theoretical concepts in the course...

Morphoid Type Theory

McAllester, David
Source: Cornell University Publisher: Cornell University
Type: Journal Article
Portuguese
Search Relevance
467.13195%
Morphoid type theory is a typed foundation for mathematics in which each type is associated with an equality relation in correspondence with the standard notions of isomorphism in mathematics. The main result is an abstraction theorem stating that isomorphic objects are substitutable in well typed contexts. A corollary is "Voldemort's theorem" stating that a non-canonical object, like a point on a circle, or an isomorphism between a finite dimensional vector space and its dual, cannot be named by a well typed expression. Morphoid type theory seems different from the recently developed homotopy type theory (HOTT). Morphoid type theory is classical and extensional while HOTT is constructive and intensional. Morphoid type theory does not involve homotopy theory. Morphoids are technically quite different from the infinity groupoids of HOTT.

Higher-Order Termination: from Kruskal to Computability

Blanqui, Frédéric; Jouannaud, Jean-Pierre; Rubio, Albert
Source: Cornell University Publisher: Cornell University
Type: Journal Article
Portuguese
Search Relevance
466.4467%
Termination is a major question in both logic and computer science. In logic, termination is at the heart of proof theory where it is usually called strong normalization (of cut elimination). In computer science, termination has always been an important issue for showing programs correct. In the early days of logic, strong normalization was usually shown by assigning ordinals to expressions in such a way that eliminating a cut would yield an expression with a smaller ordinal. In the early days of verification, computer scientists used similar ideas, interpreting the arguments of a program call by a natural number, such as their size. Showing the size of the arguments to decrease for each recursive call gives a termination proof of the program, which is however rather weak since it can only yield quite small ordinals. In the sixties, Tait invented a new method for showing cut elimination of natural deduction, based on a predicate over the set of terms, such that membership of an expression in the predicate implied the strong normalization property for that expression. The predicate being defined by induction on types, or even as a fixpoint, this method could yield much larger ordinals. Later generalized by Girard under the name of reducibility or computability candidates...
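The size-based termination argument from the early days of verification can be sketched as a runtime check: a measure into the natural numbers that strictly decreases on every nested call certifies termination. This illustrates only the measure-decrease idea, not Tait's computability method; the decorator is our own construction.

```python
import functools

def decreasing(measure):
    """Check at runtime that `measure` strictly decreases on nested calls,
    mirroring the classic size-based termination argument."""
    def wrap(f):
        stack = []  # measures of the calls currently on the call stack
        @functools.wraps(f)
        def g(*args):
            m = measure(*args)
            assert m >= 0, "measure must map into the natural numbers"
            if stack:
                assert m < stack[-1], "measure failed to decrease"
            stack.append(m)
            try:
                return f(*args)
            finally:
                stack.pop()
        return g
    return wrap

@decreasing(lambda xs: len(xs))
def total(xs):
    # Each recursive call is on a strictly shorter list, so len(xs)
    # witnesses termination.
    return 0 if not xs else xs[0] + total(xs[1:])

print(total([1, 2, 3, 4]))  # 10
```

Because the measure is a natural number, such proofs correspond to induction below the ordinal omega, which is exactly why the abstract calls the method rather weak.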

A Theory of Computation Based on Quantum Logic (I)

Ying, Mingsheng
Source: Cornell University Publisher: Cornell University
Type: Journal Article
Published on 29/03/2004 Portuguese
Search Relevance
466.53094%
The (meta)logic underlying the classical theory of computation is Boolean (two-valued) logic. Quantum logic was proposed by Birkhoff and von Neumann as a logic of quantum mechanics more than sixty years ago. The major difference between Boolean logic and quantum logic is that the latter does not enjoy distributivity in general. The rapid development of quantum computation in recent years stimulates us to establish a theory of computation based on quantum logic. The present paper is the first step toward such a new theory and it focuses on the simplest models of computation, namely finite automata. It is found that the universal validity of many properties of automata depends heavily upon the distributivity of the underlying logic. This indicates that these properties do not universally hold in the realm of quantum logic. On the other hand, we show that their local validity can be recovered by imposing a certain commutativity on the (atomic) statements about the automata under consideration. This reveals an essential difference between the classical theory of computation and the computation theory based on quantum logic.
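The failure of distributivity appears already in the lattice of closed subspaces of the plane, where propositions are {0}, lines through the origin, and the whole plane, meet is intersection, and join is the closed span. The following toy symbolic model (our own encoding, with lines named by direction) exhibits it:

```python
# Subspace lattice of the plane: 'zero', named lines, and 'plane'.
ZERO, PLANE = 'zero', 'plane'

def meet(a, b):
    """Intersection of subspaces."""
    if a == b: return a
    if a == PLANE: return b
    if b == PLANE: return a
    return ZERO  # two distinct lines intersect only at the origin

def join(a, b):
    """Closed span of subspaces."""
    if a == b: return a
    if a == ZERO: return b
    if b == ZERO: return a
    return PLANE  # two distinct lines span the plane

x_axis, y_axis, diagonal = 'x', 'y', 'diag'

lhs = meet(diagonal, join(x_axis, y_axis))                   # diag ∧ (x ∨ y)
rhs = join(meet(diagonal, x_axis), meet(diagonal, y_axis))   # (diag ∧ x) ∨ (diag ∧ y)
print(lhs, rhs, lhs == rhs)  # diag zero False
```

Here `lhs` is the diagonal itself (the diagonal lies in the plane spanned by the axes) while `rhs` collapses to {0}, so a ∧ (b ∨ c) ≠ (a ∧ b) ∨ (a ∧ c): exactly the non-distributivity on which the paper's results about automata turn.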

Negotiating over Bundles and Prices Using Aggregate Knowledge

Somefun, Koye; Klos, Tomas; La Poutré, Han
Source: Cornell University Publisher: Cornell University
Type: Journal Article
Published on 23/12/2004 Portuguese
Search Relevance
466.4467%
Combining two or more items and selling them as one good, a practice called bundling, can be a very effective strategy for reducing the costs of producing, marketing, and selling goods. In this paper, we consider a form of multi-issue negotiation where a shop negotiates both the contents and the price of bundles of goods with his customers. We present some key insights about, as well as a technique for, locating mutually beneficial alternatives to the bundle currently under negotiation. The essence of our approach lies in combining historical sales data, condensed into aggregate knowledge, with current data about the ongoing negotiation process, to exploit these insights. In particular, when negotiating a given bundle of goods with a customer, the shop analyzes the sequence of the customer's offers to determine the progress in the negotiation process. In addition, it uses aggregate knowledge concerning customers' valuations of goods in general. We show how the shop can use these two sources of data to locate promising alternatives to the current bundle. When the current negotiation's progress slows down, the shop may suggest the most promising of those alternatives and, depending on the customer's response, continue negotiating about the alternative bundle...

Evolutionary Socioeconomics: a Schumpeterian Computer Simulation

Tucci, Michele
Source: Cornell University Publisher: Cornell University
Type: Journal Article
Published on 23/04/2006 Portuguese
Search Relevance
466.74277%
The following note contains a computer simulation concerning the struggle between two companies: the first one is "the biggest zaibatsu of all", while the second one is "small, fast, ruthless". The model is based on a neo-Schumpeterian framework operating in a Darwinian evolutionary environment. After running the program a large number of times, two characteristics stand out: -- There is always a winner which takes it all, while the loser disappears. -- The key to success is the ability to employ efficiently the technological innovations. The topic of the present paper is closely related to the content of the following notes: Michele Tucci, Evolution and Gravitation: a Computer Simulation of a Non-Walrasian Equilibrium Model; Michele Tucci, Oligopolistic Competition in an Evolutionary Environment: a Computer Simulation. The texts can be downloaded respectively at the following addresses: http://arxiv.org/abs/cs.CY/0209017 http://arxiv.org/abs/cs.CY/0501037 These references include some preliminary considerations regarding the comparison between the evolutionary and the gravitational paradigms and the evaluation of approaches belonging to rival schools of economic thought.; Comment: PDF, 15 pages, 5 graphs

A Proposal To Support Wellbeing in People With Borderline Personality Disorder: Applying Reminiscent Theory in a Mobile App

Good, Alice; Wilson, Clare; Ancient, Claire; Sambhanthan, Arunasalam
Source: Cornell University Publisher: Cornell University
Type: Journal Article
Published on 21/02/2013 Portuguese
Search Relevance
466.48824%
In this paper the research draws upon reminiscence therapy, which is used in treating dementia, as an applied theory to promote wellbeing in people who experience low moods. The application proposed here aims to promote wellbeing for people suffering from mood disorders and dementia but could potentially be used to enhance wellbeing for many types of users. Use of the application is anticipated to improve mood in a group of users where severe emotional problems are prevalent. The research aims to evaluate the effectiveness of a reminiscence-based application in promoting wellbeing in people specifically with Borderline Personality Disorder (BPD). The long-term objective of this research is to establish the effectiveness of reminiscence theory on user groups aside from dementia, particularly other mental illnesses. The research advocates involving end users within the design process both to inform and evaluate the development of a mobile and tablet application.; Comment: Conference paper

2006: Celebrating 75 years of AI - History and Outlook: the Next 25 Years

Schmidhuber, Juergen
Source: Cornell University Publisher: Cornell University
Type: Journal Article
Published on 31/08/2007 Portuguese
Search Relevance
467.1421%
When Kurt Goedel laid the foundations of theoretical computer science in 1931, he also introduced essential concepts of the theory of Artificial Intelligence (AI). Although much of subsequent AI research has focused on heuristics, which still play a major role in many practical AI applications, in the new millennium AI theory has finally become a full-fledged formal science, with important optimality results for embodied agents living in unknown environments, obtained through a combination of theory a la Goedel and probability theory. Here we look back at important milestones of AI history, mention essential recent results, and speculate about what we may expect from the next 25 years, emphasizing the significance of the ongoing dramatic hardware speedups, and discussing Goedel-inspired, self-referential, self-improving universal problem solvers.; Comment: 14 pages; preprint of invited contribution to the Proceedings of the ``50th Anniversary Summit of Artificial Intelligence'' at Monte Verita, Ascona, Switzerland, 9-14 July 2006

Models and termination of proof-reduction in the $\lambda$$\Pi$-calculus modulo theory

Dowek, Gilles
Source: Cornell University Publisher: Cornell University
Type: Journal Article
Published on 26/01/2015 Portuguese
Search Relevance
466.53094%
We define a notion of model for the $\lambda \Pi$-calculus modulo theory, a notion of super-consistent theory, and prove that proof-reduction terminates in the $\lambda \Pi$-calculus modulo a super-consistent theory. We prove this way the termination of proof-reduction in two theories in the $\lambda \Pi$-calculus modulo theory, and their consistency: an embedding of Simple type theory and an embedding of the Calculus of constructions.

The selection monad as a CPS transformation

Hedges, Jules
Source: Cornell University Publisher: Cornell University
Type: Journal Article
Published on 20/03/2015 Portuguese
Search Relevance
467.13195%
A computation in the continuation monad returns a final result given a continuation, i.e. it is a function with type $(X \to R) \to R$. If we instead return the intermediate result at $X$ then our computation is called a selection function. Selection functions appear in diverse areas of mathematics and computer science (especially game theory, proof theory and topology) but the existing literature does not heavily emphasise the fact that the selection monad is a CPS translation. In particular it has so far gone unnoticed that the selection monad has a call/cc-like operator with interesting similarities and differences to the usual call/cc, which we explore experimentally using Haskell. Selection functions can be used whenever we find the intermediate result more interesting than the final result. For example a SAT solver computes an assignment to a boolean function, and then its continuation decides whether it is a satisfying assignment, and we find the assignment itself more interesting than the fact that it is or is not satisfying. In game theory we find the move chosen by a player more interesting than the outcome that results from that move. The author and collaborators are developing a theory of games in which selection functions are viewed as generalised notions of rationality...
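The paper's experiments are in Haskell; as a hedged illustration in Python (names and encodings are ours), a selection function of type $(X \to R) \to X$ takes a continuation that scores or judges candidates and returns a chosen candidate rather than the score:

```python
def argmax_over(candidates):
    """Selection function: given a scoring continuation k : X -> R,
    return a candidate maximizing k rather than the maximal score."""
    return lambda k: max(candidates, key=k)

def exists_over(candidates):
    """Selection function for SAT-style search: return a witness making the
    boolean continuation True if one exists, else an arbitrary candidate."""
    def select(k):
        for x in candidates:
            if k(x):
                return x
        return candidates[0]
    return select

# SAT example from the abstract: the solver returns the assignment itself,
# and the continuation merely judges whether it satisfies the formula.
assignments = [(a, b) for a in (False, True) for b in (False, True)]
formula = lambda ab: ab[0] and not ab[1]  # a AND (NOT b)
witness = exists_over(assignments)(formula)
print(witness, formula(witness))  # (True, False) True
```

The contrast with the continuation monad is visible in the types: `formula(witness)` is the final result an $(X \to R) \to R$ computation would return, while the selection function hands back the more interesting intermediate value `witness`.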