Page 25 of the results; 2566 digital items found in 0.041 seconds

Management and knowledge workers in Information Technology (UML)

Borges, Luís Miguel Freire Machado
Source: Repositório Científico Lusófona Publisher: Repositório Científico Lusófona
Type: Master's thesis
Portuguese
Search relevance
36.94791%
ABSTRACT: Knowledge has always existed, even in a latent state, conditioned somewhere and merely awaiting a means (an opportunity) to manifest itself. Knowledge is doubly a phenomenon of consciousness: because it proceeds from consciousness at a given moment of its life and history, and because it ends only in consciousness, perfecting and enriching it. Knowledge is thus in constant change. Knowledge Management began to be discussed relatively recently, and at the time it was strongly associated with Information Technology as a means of collecting, processing and storing ever larger quantities of information. Information Technology has played an extremely important role in organizations for some years now; it was initially adopted to automate the operational processes that support organizations' day-to-day activities, and in recent times Information Technology within organizations has evolved rapidly. All knowledge, even the least relevant to a given business area, is fundamental to supporting the decision-making process. For organizations to achieve better performance and to exceed the goals they initially set for themselves...

Rede de sensores para engenharia biomédica utilizando o protocolo IEEE1451.; Sensors network for biomedical engineering using IEEE1451 protocol.

Becari, Wesley
Source: Biblioteca Digital de Teses e Dissertações da USP Publisher: Biblioteca Digital de Teses e Dissertações da USP
Type: Master's thesis Format: application/pdf
Published on 30/01/2012 Portuguese
Search relevance
36.947598%
The use of sensors and actuators has grown dramatically in recent years. Applications centered on sensing and control advanced with industrial instrumentation, moved on to the incorporation of these elements into distributed networks and culminate, today, in integrated networks with numerous functions and applications, among them control, monitoring, tracking and security. However, the growth in the number of sensors and actuators connected through buses and networks did not happen in a unified way, and a diversity of standardization approaches proliferated for the communication between these devices and their respective monitors or controllers. From this plurality of protocols emerged the need for a standard that would allow interoperability between transducers and control networks, together with the introduction of the concept of smart sensors and actuators. In this context the IEEE1451 protocol (Standards for Smart Transducer Interface for Sensors and Actuators) was proposed. From this perspective, the present work reports the development and use of this standard in two biomedical engineering applications: first, in an embedded system capable of acquiring and processing biopotentials...
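The core idea of IEEE 1451 is that each transducer carries a self-describing Transducer Electronic Data Sheet (TEDS), so any network node can discover how to interpret its readings before requesting data. A minimal sketch of that idea, with simplified field names that are only illustrative and not the standard's normative binary TEDS layout:

```python
# Rough sketch of the idea behind an IEEE 1451 TEDS (Transducer Electronic Data
# Sheet): self-describing metadata stored with the transducer so any network node
# can discover how to interpret its readings. Field names here are simplified
# illustrations, not the standard's normative binary layout.
from dataclasses import dataclass

@dataclass
class ChannelTEDS:
    channel_id: int
    physical_unit: str      # e.g. "microvolt" for a biopotential channel
    lower_range: float
    upper_range: float
    sample_rate_hz: float

@dataclass
class TransducerTEDS:
    manufacturer: str
    model: str
    serial_number: str
    channels: list

ecg_frontend = TransducerTEDS(
    manufacturer="ExampleCorp",          # hypothetical device
    model="BioAmp-1",
    serial_number="0001",
    channels=[ChannelTEDS(0, "microvolt", -5000.0, 5000.0, 500.0)],
)

# A controller (NCAP, in IEEE 1451 terms) could read this description before
# requesting data, instead of being hard-coded for one vendor's sensor.
print(ecg_frontend)
```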

Bimetallic bars with local control of composition by three-dimensional printing

Techapiesancharoenkij, Ratchatee, 1979-
Source: Massachusetts Institute of Technology Publisher: Massachusetts Institute of Technology
Type: Doctoral thesis Format: 112 p.; 1551479 bytes; 1998869 bytes; application/pdf; application/pdf
Portuguese
Search relevance
36.947598%
Three Dimensional Printing (3DP) is a process that enables the fabrication of geometrically complex parts directly from computer-aided design (CAD) models. The success of 3DP as an alternative manufacturing technology to bulk machining of materials for complex parts has been demonstrated. By proof of concept, 3DP has demonstrated the ability to create parts with Local Control of the Composition (LCC). LCC allows tailoring the material properties in regions of a part for functional purposes. In this work, LCC was studied and demonstrated by fabricating bimetallic bars consisting of two layers of Fe-Ni alloys with different compositions and, hence, different thermal expansion properties; the coefficient of thermal expansion (CTE) of the Fe-Ni system is sensitive to its composition. Two types of binder/dopant slurries were made for the LCC bars: one consisted of dispersions of Fe₂O₃ particles in water, and the other consisted of dispersions of NiO in water. The LCC bars were successfully made by printing the Fe₂O₃/NiO slurries into Fe-30Ni base powders. After heat treatment to impart strength to the printed bars, the bars were successfully retrieved from unbound powders. The bars were then annealed at 1400 °C for 2 hours for sintering and homogenization. The final composition of the base powders was changed accordingly. In the layers on which an Fe₂O₃ slurry was printed...

Vector drop-on-demand production of tungsten carbide-cobalt tooling inserts by three dimensional printing

Guo, David, 1976-
Source: Massachusetts Institute of Technology Publisher: Massachusetts Institute of Technology
Type: Doctoral thesis Format: 120 p.; 10279604 bytes; 10294208 bytes; application/pdf; application/pdf
Portuguese
Search relevance
36.947598%
Three Dimensional Printing (3DP) is a solid freeform fabrication process used to generate solid parts directly from three-dimensional computer models. A part geometry is created by selectively depositing binder into sequentially spread layers of powder. In slurry-based 3DP, a suspension of powder in a solvent is used to form the powderbed layer. This slurry-based powderbed yields higher green density and part resolution than dry powder-based 3DP because of smaller particle size. Vector printing requires that the printhead trace and define the external geometries of a part before raster filling the interior, a new approach in comparison to conventional, raster-only printing. Drop-on-demand (DOD) printheads allow binder droplets to be ejected when needed rather than relying upon charge-and-deflect mechanisms used in continuous jet printheads. Integrating these concepts for vector, DOD printing has the potential to enhance the 3DP process by providing greater part resolution and surface finish. The 3DP slurry-based process and vector, drop-on-demand printing are examined as potential methods to produce Tungsten Carbide-Cobalt (WC-Co) tooling inserts. The research focuses on three fundamental process steps: (1) development of a stable slurry...

Rehardenable materials system with diffusion barrier for three-dimensional printing

Yuen, Cheong Wing, 1972-
Source: Massachusetts Institute of Technology Publisher: Massachusetts Institute of Technology
Type: Doctoral thesis Format: 197 p.; 16760335 bytes; 16760143 bytes; application/pdf; application/pdf
Portuguese
Search relevance
36.947598%
Three-Dimensional Printing (3DP) is a solid freeform fabrication process being developed for the direct manufacture of functional tooling and prototypes from a computer solid model. One of its many important applications is the fabrication of metal tooling for plastic injection molding. In order to achieve a fully dense 3DP metal tool, the sintered powder skeleton is infiltrated with a molten alloy, which has a melting point lower than the skeleton material. However, the choices of materials systems are limited by the interactions of the metal powders and infiltrants during the infiltration process. Currently, the materials system with the best wear resistance for 3DP metal tooling consists of 420 stainless steel powder and bronze infiltrant. However, it only has an overall hardness of 25 HRC because the bronze infiltrant is soft and not hardenable. A hardenable 3DP metal system is desirable. The main goals of this thesis research are: 1) to improve the flexibility of choice of metal powders and infiltrants by using a diffusion barrier to isolate them; and 2) to demonstrate the diffusion-barrier approach with steel and hardenable copper-alloy infiltrant. The model materials systems in this study consist of stainless steel and tool steel powder skeletons with Cu-20Ni-20Mn infiltrant. It was demonstrated that TiN coating deposited on steel substrates by CVD successfully prevented the reaction between the steel and molten Cu-20Ni-20Mn at 1200 °C. In general...

Design and analysis of artifact-resistive finger photoplethysmographic sensors for vital sign monitoring

Rhee, Sokwoo
Source: Massachusetts Institute of Technology Publisher: Massachusetts Institute of Technology
Type: Doctoral thesis Format: 101 leaves; 6681174 bytes; 6686521 bytes; application/pdf; application/pdf
Portuguese
Search relevance
36.947598%
A miniaturized, telemetric, photoplethysmograph sensor for long-term, continuous monitoring is presented in this thesis. The sensor, called a "ring sensor," is attached to a finger base for monitoring beat-to-beat pulsation, and the data is sent to a host computer via an RF transmitter. Two major design issues are addressed: one is to minimize motion artifact and the other is to minimize the consumption of battery power. An efficient double ring design is developed to lower the influence of external force, acceleration, and ambient light, and to hold the sensor gently and securely on the skin, so that circulation in the finger is not obstructed. To better understand the mechanism of motion artifact caused by external forces, a comprehensive mathematical model describing finger photoplethysmography was developed and verified by the finite element method, numerical simulation, and experiments. Total power consumption is analyzed in relation to the characteristics of the individual components, sampling rate, and CPU clock speed. Optimal operating conditions are obtained for minimizing the power budget. A prototype ring sensor is designed and built based on the power budget analysis and the artifact-resistive attachment method.; (cont.) It is verified through experiments that the ring sensor is resistant to interfering forces and acceleration acting on the ring body. It is also shown that the device meets diverse and conflicting requirements...
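The abstract says total power consumption was analyzed as a function of component characteristics, sampling rate, and CPU clock speed. As a rough illustration of that kind of budget, assuming a duty-cycled optical front end and an RF transmitter, with all numbers invented placeholders rather than values from the thesis:

```python
# Toy power-budget model for a duty-cycled optical pulse sensor with RF telemetry.
# All component currents and duty cycles are illustrative placeholders, not the
# thesis's measured values; the point is only how sampling rate enters the budget.

def average_current_ma(sample_rate_hz,
                       led_on_ma=10.0, led_on_time_s=0.5e-3,   # LED pulse per sample
                       adc_cpu_ma=2.0, adc_cpu_time_s=0.2e-3,  # acquisition + processing
                       tx_ma=8.0, tx_time_per_sample_s=0.1e-3, # radio time per sample
                       sleep_ma=0.05):                          # everything idle
    active = sample_rate_hz * (led_on_ma * led_on_time_s
                               + adc_cpu_ma * adc_cpu_time_s
                               + tx_ma * tx_time_per_sample_s)
    return active + sleep_ma

for rate in (50, 100, 200):   # candidate sampling rates in Hz
    i_avg = average_current_ma(rate)
    battery_mah = 40.0        # hypothetical coin-cell capacity
    print(f"{rate:4d} Hz -> {i_avg:.3f} mA avg, ~{battery_mah / i_avg:.0f} h runtime")
```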

Crystallographically consistent percolation theory for grain boundary networks

Frary, Megan
Source: Massachusetts Institute of Technology Publisher: Massachusetts Institute of Technology
Type: Doctoral thesis Format: 179 p.; 10684091 bytes; 10691600 bytes; application/pdf; application/pdf
Portuguese
Search relevance
36.947598%
Grain boundaries are known to play a role in many important material properties including creep resistance, ductility and cracking resistance. Although the structure and properties of individual boundaries are important, the overall behavior of the material is determined largely by the connectivity of grain boundaries in the microstructure. Grain boundary networks may be studied in the framework of percolation theory by classifying boundaries as special or general with respect to the property of interest. In standard percolation theory, boundaries are randomly assigned as special or general; however, this approach is invalid in realistic grain boundary networks due to the requirement for crystallographic consistency around any closed circuit in the microstructure. The goal of this work is to understand the effects of these local constraints on the connectivity and percolation behavior of crystallographically consistent grain boundary networks. Using computer simulations and analytical models, the behavior of crystallographically consistent networks is compared to that of randomly assembled networks at several different length scales. At the most local level, triple junctions and quadruple nodes are found to be preferentially coordinated by special and general boundaries...
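The baseline the abstract contrasts against, random assignment of boundaries as special or general, is exactly what standard bond percolation describes. A minimal sketch of that unconstrained case (a 2-D square lattice of bonds standing in for a boundary network; no crystallographic consistency is enforced, which is precisely the simplification the thesis questions):

```python
# Standard (unconstrained) bond-percolation sketch: each grain boundary is
# independently labeled "general" with probability p, and we ask whether the
# general boundaries form a connected path across the network. A real grain
# boundary network would additionally have to satisfy crystallographic
# consistency around every closed circuit, which this toy model ignores.
import random

def percolates(n=40, p=0.5, seed=0):
    """Check left-to-right connectivity of 'general' bonds on an n x n grid."""
    rng = random.Random(seed)
    # horiz[y][x] is the bond from node (x, y) to (x+1, y); similar for vert.
    horiz = [[rng.random() < p for _ in range(n - 1)] for _ in range(n)]
    vert = [[rng.random() < p for _ in range(n)] for _ in range(n - 1)]
    frontier = [(0, y) for y in range(n)]          # start from the left edge
    seen = set(frontier)
    while frontier:
        x, y = frontier.pop()
        if x == n - 1:
            return True                            # reached the right edge
        nbrs = []
        if x + 1 < n and horiz[y][x]:
            nbrs.append((x + 1, y))
        if x - 1 >= 0 and horiz[y][x - 1]:
            nbrs.append((x - 1, y))
        if y + 1 < n and vert[y][x]:
            nbrs.append((x, y + 1))
        if y - 1 >= 0 and vert[y - 1][x]:
            nbrs.append((x, y - 1))
        for nb in nbrs:
            if nb not in seen:
                seen.add(nb)
                frontier.append(nb)
    return False

# Fraction of random networks that percolate at a few bond probabilities.
for p in (0.4, 0.5, 0.6):
    hits = sum(percolates(p=p, seed=s) for s in range(50))
    print(f"p = {p}: {hits}/50 networks percolate")
```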

Experimental demonstration and exploration of quantum lattice gas algorithms

Chen, Zhiying, Ph. D. Massachusetts Institute of Technology
Source: Massachusetts Institute of Technology Publisher: Massachusetts Institute of Technology
Type: Doctoral thesis Format: 112 p.; 4521966 bytes; 4526603 bytes; application/pdf; application/pdf
Portuguese
Search relevance
36.947598%
Recently, it has been suggested that an array of small quantum information processors sharing classical information can be used to solve selected computational problems; this architecture is referred to as a type-II quantum computer. The first concrete implementation demonstrated here solves the diffusion equation, and it provides a test example from which to probe the strengths and limitations of this new computation paradigm. The NMR experiment consists of encoding a mass density onto an array of 16 two-qubit quantum information processors and then following the computation through 7 time steps of the algorithm. The results show good agreement with the analytic solution for diffusive dynamics. From the numerical simulations of the NMR implementations, we explore two major error sources: (1) the systematic error in the collision operator and (2) the linear approximation in the initialization. Since the mass density evolving under the Burgers equation develops sharp features over time, this is a stronger test of liquid-state NMR implementations of type-II quantum computers than the previous example using the diffusion equation. Small systematic errors in the collision operator accumulate and swamp all other errors. We propose, and demonstrate, that the accumulation of this error can be avoided to a large extent by replacing the single collision operator with a set of operators...
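The algorithm class involved is a lattice gas: each site holds occupations for a small set of velocities, a local collision mixes them, and a streaming step shifts them to neighboring sites; iterating recovers diffusive dynamics. A minimal classical sketch of a two-velocity (D1Q2) lattice-gas step is below, purely for illustration; the thesis implements the quantum, NMR-based version on 16 two-qubit processors:

```python
# Classical sketch of a two-velocity lattice-gas algorithm for the 1-D diffusion
# equation: each of the 16 sites carries a right-moving and a left-moving
# occupation; collision averages them locally, streaming shifts them to the
# neighboring sites (periodic boundaries). The quantum "type-II" version in the
# thesis encodes the occupations on two-qubit NMR processors instead.
import numpy as np

n_sites, n_steps = 16, 7
x = np.arange(n_sites)
rho = np.exp(-((x - n_sites / 2) ** 2) / 4.0)   # initial mass-density bump
f_right = f_left = rho / 2.0                    # split density between velocities

for _ in range(n_steps):
    # Collision: relax both occupations toward the local average.
    avg = (f_right + f_left) / 2.0
    f_right, f_left = avg.copy(), avg.copy()
    # Streaming: shift occupations one site in their direction of travel.
    f_right = np.roll(f_right, 1)
    f_left = np.roll(f_left, -1)

rho_final = f_right + f_left
print(np.round(rho_final, 3))   # the bump spreads out, as diffusion predicts
```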

The ins and outs of keeping US service jobs at work

Gorney, Eric D
Source: Massachusetts Institute of Technology Publisher: Massachusetts Institute of Technology
Type: Doctoral thesis Format: 94 p.; 3236596 bytes; 3240472 bytes; application/pdf; application/pdf
Portuguese
Search relevance
36.947598%
The purpose of this research is to discuss employment in the United States (US) service sector. The main concern is not pinpointing numerical estimates, but instead identifying trends which lead to job growth or job loss. Just as manufacturing jobs have been lost to offshore locations or productivity gains, service jobs are also at risk. Offshoring - the outsourcing of business functions overseas - and automation have the same effect of displacing workers. What keeps a service job in the US, and what makes it ideal to ship overseas or replace with a computer? Consumers have several choices between different product and service offerings, and different products need varied levels of aftermarket service. What makes customers go out and spend money rather than completing tasks themselves? This thesis attacks these questions by outlining characteristics of products, services, and consumers which could help label jobs as "safe" or "at-risk." First is a discussion of these characteristics. Then, the range of product and service alternatives that consumers have to choose from is presented and applied to examples.; (cont.) Overall, jobs which may be at-risk are those occupations that can be offshored, automated, or easily performed by consumers themselves. On the other hand...

Modular robots for making and climbing 3-D trusses

Yoon, Yeoreum
Source: Massachusetts Institute of Technology Publisher: Massachusetts Institute of Technology
Type: Doctoral thesis Format: 143 p.; 7189696 bytes; 7195667 bytes; application/pdf; application/pdf
Portuguese
Search relevance
36.947598%
Truss-climbing robots have been extensively investigated because of their wide range of promising applications, such as construction and inspection of truss structures. Such a robot is designed with degrees of freedom that let it move within three-dimensional truss structures. Although many degrees of freedom allow the robot to reach various positions and orientations, they also increase the complexity of design and control. In this thesis, the concept of modular robots is suggested as a solution to reconcile the trade-off between the functionality and the simplicity of a truss-climbing robot. A single module has fewer degrees of freedom than required to achieve full 3-D motion, but it can move freely in a 2-D plane. For full 3-D motion, multiple modules connect to and cooperate with each other. Thus, modular truss-climbing robots can have both properties: functionality and simplicity. A modular truss-climbing robot, called Shady3D, is presented as the hardware implementation of this concept. This robot has three motive degrees of freedom, and can form a six-degree-of-freedom structure by connecting to another identical module using a passive bar as a medium. Algorithms to move the robot in a 3-D truss structure have been developed and tested in hardware experiments.; (cont.) The cooperation capability of two modules is also demonstrated. As a next step beyond truss-climbing robots...

Structural Health Monitoring using Index Based Reasoning for Unmanned Aerial Vehicles

Li, Ming
Source: FIU Digital Commons Publisher: FIU Digital Commons
Type: Journal article
Portuguese
Search relevance
36.94791%
Unmanned Aerial Vehicles (UAVs) may develop cracks, erosion, delamination or other damage due to aging, fatigue or extreme loads. Identifying this damage is critical for the safe and reliable operation of the systems. Structural Health Monitoring (SHM) is capable of determining the condition of systems automatically and continually by processing and interpreting the data collected from a network of sensors embedded in the systems. With the desired awareness of the systems' health conditions, SHM can greatly reduce operational cost and speed up maintenance processes. The purpose of this study is to develop an effective, low-cost, flexible and fault-tolerant structural health monitoring system. The proposed Index Based Reasoning (IBR) system started as a simple look-up-table-based diagnostic system. Later, Fast Fourier Transform analysis and neural network diagnosis with self-learning capabilities were added. The current version is capable of classifying different health conditions using learned characteristic patterns, after training with sensory data acquired from the operating system under different statuses. The proposed IBR systems are hierarchical, distributed networks deployed in systems to monitor their health conditions. Each IBR node processes the sensory data to extract the features of the signal. Classifying tools are then used to evaluate the local conditions with health index (HI) values. The HI values are carried to other IBR nodes in the next level of the structured network. The overall health condition of the system can be obtained by evaluating all the local health conditions. The performance of IBR systems has been evaluated by both simulation and experimental studies. The IBR system has been proven successful on simulated cases of a turbojet engine...
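The abstract describes each IBR node as extracting signal features (an FFT stage was added later) and mapping them to a health index (HI) that is rolled up through a hierarchical network. A minimal sketch of one node's pipeline is below; the spectral feature, the thresholds, and the aggregation rule are illustrative assumptions, not constants from the paper:

```python
# Toy IBR-style node: extract a spectral feature from a vibration signal via FFT,
# map it to a local health index (HI), then combine local HIs into an overall
# system HI. The fault band and the aggregation rule are illustrative placeholders.
import numpy as np

def local_health_index(signal, fs, fault_band=(90.0, 110.0)):
    """HI in [0, 1]: fraction of spectral energy OUTSIDE a hypothetical fault band."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    in_band = (freqs >= fault_band[0]) & (freqs <= fault_band[1])
    total = spectrum.sum()
    return 1.0 - spectrum[in_band].sum() / total if total > 0 else 1.0

def system_health_index(local_his):
    """Next-level node: a conservative roll-up takes the worst local condition."""
    return min(local_his)

fs = 1000.0
t = np.arange(0, 1.0, 1.0 / fs)
healthy = np.sin(2 * np.pi * 30 * t)                       # nominal 30 Hz response
damaged = healthy + 0.8 * np.sin(2 * np.pi * 100 * t)      # extra energy near 100 Hz

his = [local_health_index(healthy, fs), local_health_index(damaged, fs)]
print([round(h, 2) for h in his], "->", round(system_health_index(his), 2))
```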

Computational algebraic attacks on the Advanced Encryption Standard (AES)

Mantzouris, Panteleimon
Source: Monterey, California. Naval Postgraduate School Publisher: Monterey, California. Naval Postgraduate School
Type: Doctoral thesis Format: xvi, 103 p. : col. ill.
Portuguese
Search relevance
36.94791%
This thesis examines the vulnerability of the Advanced Encryption Standard (AES) to algebraic attacks. It explores how strong the Rijndael algorithm must be in order to secure important federal information. There are several algebraic methods of attack that can be used to break a specific cipher, such as Buchberger's method and Faugère's F4 and F5 methods. The method used and evaluated in this thesis is Multiple Right Hand Sides (MRHS) linear equations. MRHS is a new method that allows computations to be more efficient and the equations to be more compact in comparison with the previously mentioned methods. Because of the high complexity of the Rijndael algorithm, the purpose of this thesis is to investigate the results of an MRHS attack on a small-scale variant of the AES, since it is impossible to break the actual algorithm using only existing knowledge. Instead of the original ten rounds of the AES algorithm, variants of up to four rounds were used. Simple examples of deciphering some ciphertexts are presented for different variants of the AES, and the new attack method of MRHS linear equations is compared with the older methods. This method is more effective timewise than the older methods, but, in some cases...
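An MRHS equation constrains a linear combination of the unknowns to lie in a small set of admissible right-hand sides rather than fixing a single one. The toy system below only illustrates that representation over GF(2) and brute-forces it; the real attack solves such systems by structured merging of equations, and none of the values come from the thesis:

```python
# Minimal sketch of the Multiple Right Hand Sides (MRHS) equation form, not the
# attack itself: an MRHS equation states that A @ x (over GF(2)) must equal one
# of several allowed right-hand-side vectors. Tiny toy system, brute-forced.
import itertools
import numpy as np

def satisfies(A, allowed, x):
    """True if A @ x (mod 2) matches one of the allowed right-hand sides."""
    lhs = A.dot(x) % 2
    return any(np.array_equal(lhs, rhs) for rhs in allowed)

# Two toy MRHS equations over 3 binary variables (illustrative values only).
eq1 = (np.array([[1, 0, 1], [0, 1, 1]]), [np.array([0, 1]), np.array([1, 0])])
eq2 = (np.array([[1, 1, 0]]),            [np.array([1])])

solutions = [x for x in itertools.product([0, 1], repeat=3)
             if all(satisfies(A, allowed, np.array(x)) for A, allowed in (eq1, eq2))]
print(solutions)
```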

Empirical Studies of Code Clone Genealogies

BARBOUR, LILIANE JEANNE
Source: Queen's University Publisher: Queen's University
Type: Doctoral thesis
Portuguese
Search relevance
36.94791%
Two identical or similar code fragments form a clone pair. Previous studies have identified cloning as a risky practice; therefore, a developer needs to be aware of any clone pairs so as to properly propagate any changes between clones. A clone pair experiences many changes during the creation and maintenance of software systems. A change can either maintain or remove the similarity between clones in a clone pair. If a change maintains the similarity between clones, the clone pair is left in a consistent state. However, if a change makes the clones no longer similar, the clone pair is left in an inconsistent state. The set of states and changes experienced by clone pairs over time forms an evolution history known as a clone genealogy. In this thesis, we provide a formal definition of clone genealogies and perform two case studies to examine them. In the first study, we examine clone genealogies to identify fault-prone “patterns” of states and changes. We also build prediction models using clone metrics from one snapshot and compare them to models that include historical evolutionary information about code clones. We examine three long-lived software systems and identify clones using the Simian and CCFinder clone detection tools. The results show that there is a relationship between the size of a clone, the time interval between changes, and the fault-proneness of a clone pair. Additionally...
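The genealogy defined in the abstract is essentially a sequence of states (consistent or inconsistent) connected by changes that either maintain or remove the similarity between the two clones. A minimal sketch of that bookkeeping, where the similarity test is a crude placeholder rather than Simian's or CCFinder's detection logic:

```python
# Minimal model of a clone genealogy: a clone pair moves through CONSISTENT /
# INCONSISTENT states as changes are applied to either fragment. The similarity
# test here is a crude text-ratio placeholder, not a real clone detector.
from difflib import SequenceMatcher

def similar(a, b, threshold=0.8):
    return SequenceMatcher(None, a, b).ratio() >= threshold

def genealogy(initial_pair, changes):
    """changes: list of (which_clone, new_text); returns the sequence of states."""
    a, b = initial_pair
    states = ["CONSISTENT" if similar(a, b) else "INCONSISTENT"]
    for which, new_text in changes:
        if which == 0:
            a = new_text
        else:
            b = new_text
        states.append("CONSISTENT" if similar(a, b) else "INCONSISTENT")
    return states

pair = ("total = sum(xs); return total / len(xs)",
        "total = sum(ys); return total / len(ys)")
edits = [
    (0, "total = sum(xs); return total / max(len(xs), 1)"),   # similarity kept
    (1, "return statistics.mean(ys)"),                        # similarity removed
]
print(genealogy(pair, edits))   # e.g. ['CONSISTENT', 'CONSISTENT', 'INCONSISTENT']
```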

Analyzing a practitioner perspective on relevance of published empirical research in Requirements Engineering

Tax, Niek
Source: Cornell University Publisher: Cornell University
Type: Journal article
Published on 05/03/2014 Portuguese
Search relevance
36.94791%
Background: Relevance to industry and scientific rigor have long been an area of friction in IS research. However, little work has been done on how to evaluate the relevance of IS research. Kitchenham et al. [13] proposed one of the few relevance-evaluation instruments in the literature, later revised by Daneva et al. [7]. Aim: To analyze the practitioner/consultant-perspective checklist for relevance in order to evaluate its comprehensibility and applicability from the point of view of the practitioner/consultant in the context of an advanced university classroom. Method: Five master's-level students in the field of IS assessed a set of 24 papers using the relevance checklist. For each question in the checklist, inter-rater agreement was calculated, and the reasoning that the practitioners applied was reconstructed from their comments. Results: Inter-rater agreement was only slight for three questions and poor for all other questions. Analysis of the comments provided by the practitioners showed only two questions that were interpreted in the same way by all practitioners. These two questions showed significantly higher inter-rater agreement than the other questions. Conclusions: The generally low inter-rater agreement can be read as an indication that the checklist, in its current form, is not appropriate for measuring the industry relevance of IS research. The different interpretations found for the checklist questions provide useful insight for reformulating the questions. Reformulations are proposed for some questions.; Comment: 9 pages
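Agreement among several raters scoring the same items on categorical questions is commonly quantified with Fleiss' kappa, where values near 0 correspond to "poor" and roughly 0 to 0.2 to "slight" agreement on the usual Landis-Koch scale. The abstract does not say which statistic the authors used, so the sketch below only illustrates that kind of calculation with invented ratings:

```python
# Fleiss' kappa for one checklist question: 5 raters assign each of N papers to
# one of k categories (here yes / no / unsure). The rating counts are made up
# purely to illustrate the computation; they are not the study's data.
import numpy as np

def fleiss_kappa(counts):
    """counts: (N items) x (k categories) matrix of how many raters chose each category."""
    counts = np.asarray(counts, dtype=float)
    n = counts.sum(axis=1)[0]                    # raters per item (assumed constant)
    p_j = counts.sum(axis=0) / counts.sum()      # overall category proportions
    P_i = (np.square(counts).sum(axis=1) - n) / (n * (n - 1))   # per-item agreement
    P_bar, P_e = P_i.mean(), np.square(p_j).sum()
    return (P_bar - P_e) / (1 - P_e)

# 8 hypothetical papers rated by 5 raters on one yes/no/unsure question.
ratings = [[3, 1, 1], [2, 2, 1], [1, 3, 1], [2, 1, 2],
           [1, 2, 2], [3, 2, 0], [2, 2, 1], [1, 1, 3]]
print(round(fleiss_kappa(ratings), 3))   # values near 0 indicate poor agreement
```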

Enhancing Human Aspect of Software Engineering using Bayesian Classifier

Gupta, Sangita; V, Suma
Source: Cornell University Publisher: Cornell University
Type: Journal article
Published on 11/02/2014 Portuguese
Search relevance
36.94791%
IT industries today must compete effectively on cost, quality, service, and innovation to survive in the global market. Owing to the swift transformation of technology, software companies have to manage large sets of data with valuable information hidden in them. Data mining techniques make it possible to exploit this hidden information, and they can be applied to code optimization, fault prediction, and other domains that shape the success of software projects. Additionally, the efficiency of the developed product further depends upon the quality of the project personnel. The aim of this paper is therefore to explore the potential of project personnel in terms of their competency and skill set and its influence on project quality. This objective is accomplished using a Bayesian classifier to capture patterns of human performance. By this means, the hidden and valuable knowledge discovered in the related databases is summarized in a statistical structure. This predictive approach enables project managers to reduce the failure ratio to a significant degree and to improve project performance through the right choice of project personnel.; Comment: 5 Pages...
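A Bayesian classifier over categorical personnel attributes is typically realized as naive Bayes. The abstract does not publish its feature set, so the attributes, categories, and training rows below are hypothetical placeholders; the sketch only shows the shape of such a model:

```python
# Naive Bayes sketch for predicting project outcome from personnel attributes.
# Feature names, categories, and training rows are hypothetical placeholders,
# not data from the paper.
from sklearn.naive_bayes import CategoricalNB
from sklearn.preprocessing import OrdinalEncoder

# Columns: [experience, domain_knowledge, certification] -- all categorical.
rows = [
    ["high", "good", "yes"], ["high", "average", "yes"], ["low", "poor", "no"],
    ["medium", "good", "no"], ["low", "average", "no"], ["medium", "poor", "no"],
    ["high", "good", "no"], ["low", "good", "yes"],
]
labels = ["success", "success", "failure", "success",
          "failure", "failure", "success", "success"]

encoder = OrdinalEncoder()
X = encoder.fit_transform(rows)            # encode categories as integer codes

model = CategoricalNB()
model.fit(X, labels)

candidate = encoder.transform([["medium", "average", "yes"]])
print(model.predict(candidate), model.predict_proba(candidate))
```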

Server selection for mobile agent migration

Caro, Wayne
Source: Rochester Institute of Technology Publisher: Rochester Institute of Technology
Type: Doctoral thesis
Portuguese
Search relevance
36.953413%
The purpose of this thesis is to develop, test, and simulate an algorithm that mobile software agents can use to select a server to which the agents can migrate. Software agents are autonomous software entities that perform tasks on behalf of other agents or humans, and that have some degree of intelligence. In particular, a mobile software agent is capable of migrating from one computer system (agent server) to another during the course of performing its tasks. Most current implementations of mobile software agents (simply referred to as agents) have simple forms of server selection. The algorithm discussed in this thesis proposes new ideas for dealing with the server selection process. The algorithm proposed in this thesis is intended to provide a good basis from which further work can be continued in the area of agent server selection. This algorithm was demonstrated to work as expected under a set of boundary conditions of purely abstract computer resources. Then the algorithm was used in a simulation of a print job scheduler for a cluster of printers. Some of the concepts that this algorithm uses are resource importance factors, "needed" and "wanted" resources, risk factors, server resource evaluations, and server resource availability.
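The abstract lists the ingredients of the selection algorithm (resource importance factors, "needed" and "wanted" resources, risk factors, and per-server resource evaluations) without giving the scoring formula. Below is a minimal sketch of one plausible way to combine such ingredients; the weighting scheme is an assumption made for illustration, not the algorithm defined in the thesis:

```python
# Hypothetical server-selection score: a server is disqualified if any "needed"
# resource is unavailable; otherwise it is scored by importance-weighted
# availability of "wanted" resources, discounted by a risk factor. The formula
# is an illustrative assumption, not the thesis's algorithm.

def score_server(server, needed, wanted):
    """server: dict with 'resources' (name -> availability in [0,1]) and 'risk' in [0,1]."""
    resources = server["resources"]
    if any(resources.get(name, 0.0) <= 0.0 for name in needed):
        return None                                   # missing a needed resource
    weighted = sum(weight * resources.get(name, 0.0)
                   for name, weight in wanted.items())
    return weighted * (1.0 - server["risk"])

needed = ["print_queue"]                              # must exist on the server
wanted = {"cpu": 0.3, "memory": 0.2, "toner": 0.5}    # importance factors

servers = {
    "printer-a": {"resources": {"print_queue": 1.0, "cpu": 0.7, "memory": 0.9, "toner": 0.2},
                  "risk": 0.1},
    "printer-b": {"resources": {"print_queue": 1.0, "cpu": 0.4, "memory": 0.6, "toner": 0.9},
                  "risk": 0.2},
    "printer-c": {"resources": {"cpu": 0.9, "memory": 0.9, "toner": 0.9}, "risk": 0.0},
}

scores = {name: score_server(s, needed, wanted) for name, s in servers.items()}
best = max((n for n in scores if scores[n] is not None), key=lambda n: scores[n])
print(scores, "->", best)
```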

B-spline surface techniques for solid modeling: an application to computer-aided geometric design

Tang, Chi-Ming
Source: Rochester Institute of Technology Publisher: Rochester Institute of Technology
Type: Doctoral thesis
Portuguese
Search relevance
36.953413%
One important area of Computer-Aided Geometric Design (CAGD) is concerned with the approximation and representation of the surfaces of solid objects. Accurately describing the shape of an object so that the description is useful to designers who must decide how to manipulate it is an important problem. B-spline techniques promise greater versatility in describing complex surfaces than other techniques, thus the B-spline surface is highlighted in the field of constructive solid geometric modeling. A method for drawing complex surfaces by using B-spline techniques is presented. The tensor product surface scheme is developed for constructing sculptured surfaces. Also, the basic principle of multivariate B-splines, i.e., nontensor product surfaces, the light of tomorrow in CAGD, is introduced.
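A tensor-product B-spline surface is evaluated as S(u, v) = Σ_i Σ_j N_i(u) N_j(v) P_ij, where the N are univariate B-spline basis functions and P_ij is the control net. A short sketch of that evaluation using the Cox-de Boor recursion; the knots and control points are arbitrary values chosen only to make the sketch runnable:

```python
# Evaluate one point on a tensor-product B-spline surface:
#   S(u, v) = sum_i sum_j N_i(u) * N_j(v) * P[i][j]
# using the Cox-de Boor recursion for the basis functions.
import numpy as np

def basis(i, k, t, knots):
    """Cox-de Boor: value of the i-th B-spline basis of order k (degree k-1) at t."""
    if k == 1:
        return 1.0 if knots[i] <= t < knots[i + 1] else 0.0
    left = right = 0.0
    if knots[i + k - 1] != knots[i]:
        left = (t - knots[i]) / (knots[i + k - 1] - knots[i]) * basis(i, k - 1, t, knots)
    if knots[i + k] != knots[i + 1]:
        right = (knots[i + k] - t) / (knots[i + k] - knots[i + 1]) * basis(i + 1, k - 1, t, knots)
    return left + right

def surface_point(u, v, P, ku, kv, knots_u, knots_v):
    nu, nv = P.shape
    Nu = [basis(i, ku, u, knots_u) for i in range(nu)]
    Nv = [basis(j, kv, v, knots_v) for j in range(nv)]
    return sum(Nu[i] * Nv[j] * P[i, j] for i in range(nu) for j in range(nv))

# 4 x 4 control net of heights over a unit parameter square, cubic (order 4)
# in both directions with clamped knot vectors.
P = np.array([[0.0, 0.2, 0.2, 0.0],
              [0.2, 1.0, 1.0, 0.2],
              [0.2, 1.0, 1.0, 0.2],
              [0.0, 0.2, 0.2, 0.0]])
knots = [0, 0, 0, 0, 1, 1, 1, 1]          # clamped cubic, 4 control points
print(surface_point(0.5, 0.5, P, 4, 4, knots, knots))   # height near the middle bump
```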

Game theory MANET routing for jamming environment

Zhu, Yi
Source: University of Delaware Publisher: University of Delaware
Type: Doctoral thesis
Portuguese
Search relevance
36.953413%
Bohacek, Stephan K.; A Mobile Ad Hoc Network (MANET) is a set of self-organizing wireless mobile nodes which communicate with each other without any existing network infrastructure or centralized network management. Wireless communication over a MANET follows a multi-hop manner, in which each node can be a transceiver or a router. MANETs have spurred considerable research interest and applications in the fields of vehicular networks, sensor networks and tactical communication networks. A tactical communication network operates in a highly dynamic environment with various kinds of interference and jamming, which can cause low packet delivery rates, long delays or even interruptions. Moreover, the effects may spread across multiple layers of the network protocol stack, typically the physical (PHY) layer and the media access control (MAC) layer. Because of these interruptions and jamming, MANET routing algorithms must be robust enough to give reliable network service. Furthermore, since the end-to-end communications of a MANET are achieved via multi-hop relays, the routing protocol is an essential factor that affects the overall performance of the MANET. The Optimized Link State Routing (OLSR) protocol is a routing protocol for MANETs, built on the classical link state algorithm with multi-point relays (MPRs). OLSR has been shown to be suitable for large-scale dense wireless networks. In this thesis...
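OLSR reduces flooding overhead by having each node pick a subset of its one-hop neighbors, the multi-point relays (MPRs), that together cover all of its two-hop neighbors; only MPRs retransmit control traffic. A minimal greedy sketch of that selection, simplified relative to RFC 3626 (which also considers willingness values and degree tie-breaks):

```python
# Simplified greedy MPR selection in the spirit of OLSR: pick one-hop neighbors
# until every two-hop neighbor is covered. RFC 3626 adds rules (willingness,
# covering "only reachable via one neighbor" nodes first, degree tie-breaking)
# that are omitted here for brevity.

def select_mprs(one_hop, two_hop_via):
    """one_hop: set of neighbors; two_hop_via: neighbor -> set of 2-hop nodes reached via it."""
    uncovered = set().union(*two_hop_via.values()) - one_hop
    mprs = set()
    while uncovered:
        # Greedily take the neighbor covering the most still-uncovered 2-hop nodes.
        best = max(one_hop - mprs, key=lambda n: len(two_hop_via[n] & uncovered))
        if not two_hop_via[best] & uncovered:
            break                                    # remaining nodes are unreachable
        mprs.add(best)
        uncovered -= two_hop_via[best]
    return mprs

one_hop = {"A", "B", "C"}
two_hop_via = {                                       # illustrative topology
    "A": {"D", "E"},
    "B": {"E", "F", "G"},
    "C": {"G"},
}
print(select_mprs(one_hop, two_hop_via))              # e.g. {'A', 'B'}
```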

A Search Optimization in FFTW

Gu, Liang
Source: University of Delaware Publisher: University of Delaware
Type: Doctoral thesis
Portuguese
Search relevance
36.953413%
Li, Xiaoming; Generating high-performance fast Fourier transform (FFT) libraries for different computer architectures is an important task. Architecture vendors sometimes have to rely on dedicated experts to tune the FFT implementation on each new platform. The Fastest Fourier Transform in the West (FFTW) replaces this tedious and repeated work with an adaptive FFT library. It automatically generates FFT code that is comparable to libraries provided by vendors. Part of its success is due to its highly efficient straight-line-style code for small DFTs, called codelets. The other part of its success is the result of a large and carefully chosen search space of FFT algorithms. FFTW mainly traverses this space by empirical search; otherwise a simple heuristic is used. However, both methods have their downsides. The empirical search method spends a lot of search time on large DFT problems, and the simple heuristic often delivers implementations that are much worse than optimal. An ideal approach should find a reasonably good implementation within the FFT search space in a small amount of time. Model-driven optimization is often believed to be inferior to empirical search. It is very hard to capture all the performance features of an adaptive library on many modern architectures. No one has implemented an adaptive performance model to automatically assist the search of FFT algorithms on multiple architectures. This thesis presents an implicit abstract machine model and a codelet performance model that can be used in the current FFTW framework. With the performance prediction given by these models...
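The tension the abstract describes is between exhaustive empirical search over FFT decompositions (accurate but slow to plan) and a model that ranks decompositions without timing them. A toy sketch of the model-driven idea follows; the cost model is an invented placeholder, not FFTW's planner or the thesis's model:

```python
# Toy model-driven FFT plan search: a size-N DFT can be decomposed recursively as
# N = r * m (Cooley-Tukey), and a planner must pick the radix r at each level.
# A made-up cost model scores each decomposition so the "search" needs no
# measurements; FFTW's real planner empirically times candidate plans instead.
from functools import lru_cache

CODELET_SIZES = {2, 3, 4, 5, 7, 8, 16, 32}   # sizes assumed to have straight-line codelets

def modeled_cost(n, radix):
    # Invented model: codelet work plus a penalty for the twiddle/stride pass.
    return 1.0 * radix + 0.2 * n

@lru_cache(maxsize=None)
def plan(n):
    """Return (modeled cost, radix decomposition) for a size-n DFT under the toy model."""
    if n in CODELET_SIZES:
        return (1.0 * n, [n])
    best = None
    for r in sorted(CODELET_SIZES):
        if n % r == 0 and n // r > 1:
            sub_cost, sub_plan = plan(n // r)
            total = modeled_cost(n, r) + r * sub_cost
            if best is None or total < best[0]:
                best = (total, [r] + sub_plan)
    if best is None:                           # prime size outside the codelet set
        return (10.0 * n, [n])
    return best

for n in (64, 120, 1024):
    cost, decomposition = plan(n)
    print(n, decomposition, round(cost, 1))
```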

Avaliação da eficiência energética de edificações residenciais em fase de projeto : análise de desempenho térmico pelo método prescritivo e por simulação computacional aplicados a estudo de caso de projeto-tipo do Exército Brasileiro; Energy efficiency assessment of residential buildings in design phase : an analysis of thermal performance using prescriptive and computer simulation methods applied to a case study of a Brazilian Army's standard design

Marcus Vinicius de Paiva Rodrigues
Source: Biblioteca Digital da Unicamp Publisher: Biblioteca Digital da Unicamp
Type: Master's thesis Format: application/pdf
Published on 26/08/2015 Portuguese
Search relevance
36.953413%
This study assesses the energy efficiency of a residential building in the design phase by analyzing bioclimatic architecture concepts and thermal performance, using both the prescriptive method and computer simulation. The prescriptive assessment is based on the Brazilian thermal performance standards and on the technical regulations for building energy efficiency of the National Program for Electrical Energy Conservation, Buildings Subprogram (PROCEL-Edifica), as well as on the building codes of the Brazilian Army, the institution whose standard residential building design was chosen for the case study. The computer simulation uses EnergyPlus as the reference software, while also drawing on other tools such as Ecotect and Climate Consultant as the research required. The thermo-energetic performance of the building is evaluated from current climate data for the construction site, the city of Rio de Janeiro, and the conditions of a future climate scenario for the year 2020 are characterized through data extrapolation with the CCWorldWeatherGen software. The methodology employed and the results are intended to be used: i) to improve the design of the case study...