With technology advancing rapidly, the need for precise measurement equipment has also intensified. One such device is the SICK Laser Measurement System (LMS). Based on a time-of-flight measurement principle, this sensor allows for centimeter resolution. This thesis presents a basic understanding and characterization of the laser measurement system necessary for successful implementation in an experimental environment. A personal computer to LMS software interface is also presented. The SICK Company provides acquisition software that displays a scanned profile in real time, but it has no provision for saving the acquired image and corresponding data, nor is it suited to on-demand modifications such as actively setting the operation mode of the LMS. Having software that handles the communication between the LMS and PC and can be seamlessly integrated into an experimental environment is essential. The laser scanner then becomes an onboard sensor capable of sourcing real-time data for use in pose estimation or an online control algorithm, among other possibilities.; (cont.) To further expand the utility of the range sensor in experiments, a MATLAB GUI was created that can dynamically localize the LMS with respect to a given coordinate system. Self-localization of the laser scanner allows for more complex experimental setups without the need for cumbersome and often inaccurate human-based measurements.; by Marcos Berrios.; Thesis (S.B.)--Massachusetts Institute of Technology...
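The time-of-flight principle the abstract names reduces to halving the distance light travels during the measured round trip. A minimal sketch, with illustrative numbers that are not SICK specifications:

```python
# Time-of-flight ranging: range is half the round-trip distance the laser
# pulse travels. Values are illustrative, not from the SICK documentation.

C = 299_792_458.0  # speed of light in m/s

def tof_range(round_trip_time_s: float) -> float:
    """Range in metres from a measured round-trip pulse time."""
    return C * round_trip_time_s / 2.0

# A target about 1 m away returns the pulse after roughly 6.67 ns:
r = tof_range(6.671e-9)
```

The nanosecond-scale timing this implies is why the sensor's resolution is on the order of centimeters rather than millimeters.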
The medical field, and surgeons in particular, are turning to engineers to develop systems that help them learn their craft better. Mannequin-based systems, animal labs and surgery on cadavers each have drawbacks that could be addressed through realistic computer-based surgical simulation systems. To generate a simulation that includes both tactile/haptic and visual feedback, one must know the material properties of tissue, so that a finite element or other model can generate the proper predictions for interactions between surgical instruments and tissue. This thesis presents the design, construction, characterization, and use of a minimally invasive surgical instrument designed to measure the linear visco-elastic properties of solid organs. The Tissue Material Property Sampling Tool, or TeMPeST 1-D, applies a small amplitude vibration normal to the surface of an organ such as liver or spleen, and records the applied force and displacement. It has a range of motion of up to 1 mm, and can apply up to 300 mN force with a 5 mm right circular indenter. The open loop bandwidth of the system is approximately 100 Hz, which is greater than the bandwidth of both the human visual and motor control systems. The relationships between indentation force and displacement and material properties such as the elastic modulus of tissue are presented...
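One classical way to connect the quantities the abstract names is Sneddon's solution for a rigid flat circular punch on an elastic half-space. The sketch below uses that relation with illustrative values; the incompressibility assumption (nu = 0.5) and the reading of "5 mm indenter" as a diameter are my assumptions, not necessarily the thesis's model.

```python
# Illustrative only: flat-punch relation F = 2*a*delta*E/(1 - nu^2) for a
# rigid right circular indenter of radius a on an elastic half-space,
# rearranged to estimate the elastic modulus E from force/displacement.
# nu = 0.5 (incompressible tissue) is an assumption.

def elastic_modulus_flat_punch(force_N: float, displacement_m: float,
                               radius_m: float, nu: float = 0.5) -> float:
    """Elastic modulus in Pa from a flat-punch indentation measurement."""
    return force_N * (1.0 - nu ** 2) / (2.0 * radius_m * displacement_m)

# 300 mN at 1 mm indentation, assuming the 5 mm figure is the diameter
# (2.5 mm radius): roughly tens of kPa, a plausible soft-tissue range.
E = elastic_modulus_flat_punch(0.3, 1.0e-3, 2.5e-3)
```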
The Decision Aids for Tunneling (DAT) are a computer-based method with which distributions of tunnel construction time and cost, as well as required and produced resources, can be estimated considering uncertainties in geologic conditions, construction processes and resources. The results of the DAT in turn can be used for various decision making processes. Although the DAT included a resource management model, it was not developed to the same level as the other parts of the method. Hence, the main objectives of this research are to define the requirements for an adequate and comprehensive resource model for tunneling and to develop a new resource model satisfying these requirements. There are three major developments and contributions of the new resource model: 1. In order to produce complete and accurate cost and time estimates, the new resource model can explicitly estimate cost and time based on the actual amount of resources used for and produced from tunnel construction. 2. Resource scheduling and planning features have been implemented in the new resource model. This allows one to determine the optimal tunneling plan, which takes into account the technical precedence of the tunneling activities, the resource/space availability, the dynamic status of the process...
The supercritical carbon dioxide (S-CO₂) cycle is a promising advanced power conversion cycle which couples nicely to many Generation IV nuclear reactors. This work investigates the power conversion system design and proposes several "Third Generation" plant layouts for power ratings ranging between 20 and 1200 MWe for the recompression cycle. A 20 MWe simple cycle layout was also developed. The cycle designs are characterized by a dispersed component layout in which a single shaft turbomachinery train is coupled to parallel arrays of multiple printed circuit heat exchanger modules. This configuration has arrangement benefits in terms of modularity, inspectability, repairability and replaceability. Compared to the prior second generation dispersed layouts, its lower ductwork pressure drop confers approximately 2% higher thermal efficiency. Two alternative S-CO₂ cycle designs for medium power applications were developed using an in-house optimization computer code and Solid Edge software. The first design is a recompression cycle derived from the 300 MWe design developed at MIT for Generation IV reactors. The design employs one turbine, two compressors (main and recompression) working in parallel and two recuperators (high and low temperature) and maximizes cycle efficiency while striving for a small plant footprint. The second design is a simple S-CO₂ power cycle...
This dissertation established a state-of-the-art programming tool for designing and training artificial neural networks (ANNs) and showed its applicability to brain research. The developed tool, called NeuralStudio, allows users without programming skills to conduct studies based on ANNs in a powerful and user-friendly interface. A series of unique features has been implemented in NeuralStudio, such as ROC analysis, cross-validation, network averaging, topology optimization, and optimization of the activation function's slopes. It also includes a Support Vector Machines module for comparison purposes. Once the tool was fully developed, it was applied to two studies in brain research. In the first study, the goal was to create and train an ANN to detect epileptic seizures from subdural EEG. This analysis involved extracting features from the spectral power in the gamma frequencies. In the second application, a unique method was devised to link EEG recordings to epileptic and nonepileptic subjects. The contribution of this method consisted of developing a descriptor matrix that can represent any EEG file regardless of its duration and number of electrodes. The first study showed that the inter-electrode mean of the spectral power in the gamma frequencies and its duration above a specific threshold perform better than the other frequencies in seizure detection...
This study proposes a new framework that can effectively apply unit testing to concurrent programs, which are difficult to develop and debug. Test-driven development, a practice enabling developers to detect bugs early by incorporating unit testing into the development process, has become widespread, but it has only been effective for programs with a single thread of control. The order of operations in different threads is essentially non-deterministic, making it more complicated to reason about program properties in concurrent programs than in single-threaded programs. Because hardware, operating systems, and compiler optimizations influence the order in which operations in different threads are executed, debugging is problematic: a problem often cannot be reproduced on other machines. Multi-core processors, which have replaced older single-core designs, have exacerbated these problems because they demand the use of concurrency if programs are to benefit from new processors. The existing tools for unit testing concurrent programs are either flawed or too costly. JUnit, for instance, assumes that programs are single-threaded and therefore does not work for concurrent programs; ConTest and rstest predate the revised Java memory model and make incorrect assumptions about the operations that affect synchronization. Approaches such as model checking or comprehensive schedule-based execution are too costly to be used frequently. All of these problems prevent software developers from adopting the current tools on a large scale. The proposed framework (i) improves JUnit to recognize errors in all threads...
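The non-determinism at issue can be made concrete by replaying one bad interleaving by hand. This tiny sketch (not from the study) shows the classic lost update that a schedule-sensitive unit test may or may not catch, depending on how the scheduler happens to interleave the threads:

```python
# Deterministic replay of a lost-update race. With real threads the
# interleaving below happens only sometimes, which is exactly why such
# bugs escape conventional single-schedule unit tests.

counter = 0

t1_read = counter        # "thread 1" reads 0
t2_read = counter        # "thread 2" reads 0 before thread 1 writes back
counter = t1_read + 1    # thread 1 writes 1
counter = t2_read + 1    # thread 2 overwrites with 1: one increment lost
```

After two increments the counter holds 1, not 2; a test that only ever observes the benign schedule would pass.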
peer-reviewed; Variability management (VM) is a fundamental activity of software product line engineering (SPLE). VM explicitly represents software artifact variations for managing dependencies among SPL variants and supports their instantiation throughout the SPL life cycle. It involves complex and challenging tasks, which must be supported by effective methods, techniques, and tools. Researchers have studied these challenges and proposed solutions to them for nearly 20 years. This article reports results from a study to systematically review the research and synthesize the evidence regarding the effectiveness of proposed solutions. One Web extra offers a systematic literature review of a study in which the authors assessed 97 papers that either claimed or provided some kind of evaluation of a variability management approach, technique, or tool. The other Web extra is an erratum to this article.
The Unified Modelling Language (UML) is intended to express complex ideas in an intuitive and easily understood way. It is important because it is widely used in software engineering and other disciplines. Although an official definition document exists, there is much debate over the precise meaning of UML models. ¶ In response, the academic community has put forward many different proposals for formalising UML, but it is not at all obvious how to decide between them. Indeed, given that UML practitioners are inclined to reject formalisms as non-intuitive, it is not even obvious that the definition should be "formal" at all. Rather than searching for yet another formalisation of UML, our main aim is to determine what would constitute a good definition of UML. ¶ The first chapter sets the UML definition problem in a broad context, relating it to work in logic and the philosophy of science. ...
Current projections show that U.S. international trade is expected to reach nearly two billion tons by 2020, approximately double today's level. With such a large forecasted growth in trade coming through the United States and growing problems associated with highway congestion, air pollution, and national security, building short sea shipping networks will be difficult, but possible, and potentially of great benefit to the nation. By bringing together shipping providers, customers, and with support from the federal government, short sea shipping can become a reality. This paper outlines the need for a change in our maritime transportation system. It takes a look at the current uses of short sea shipping in the United States as well as the system used in Europe. The technology associated with this concept is described and high-speed vessel design is investigated. Issues relating to the integration of short sea shipping are brought to light, including customer requirements, capital financing, and government policy. A computer-based simulation model calculates a total cost analysis for two modes of transporting goods, trucking and short sea shipping. The model is applied to a group of products of different size, weight, and value.; (cont.) The quantitative results of the model show that in most cases...
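The thesis's simulation model computes a total cost for each transport mode; the sketch below is only an illustrative stand-in for that idea. The cost structure (line-haul rate, handling charge, in-transit inventory holding) and every number are assumptions of mine, not the thesis's data.

```python
# Illustrative total-cost comparison in the spirit of a mode-choice model:
# line-haul cost + handling + inventory holding cost during transit.
# All rates, distances, and transit times below are invented placeholders.

def total_cost(distance_km: float, cargo_value: float, rate_per_km: float,
               transit_days: float, handling: float = 200.0,
               daily_holding_rate: float = 0.0005) -> float:
    """Total door-to-door cost for one shipment, in the same currency unit."""
    inventory = cargo_value * daily_holding_rate * transit_days
    return rate_per_km * distance_km + handling + inventory

# Slower, cheaper sea leg vs. faster, dearer trucking for the same lane:
truck = total_cost(800, 50_000, rate_per_km=1.5, transit_days=1, handling=0.0)
ship = total_cost(800, 50_000, rate_per_km=0.6, transit_days=3)
```

The trade-off the thesis quantifies is visible even in this toy form: the sea mode's extra transit time raises inventory cost, but the lower line-haul rate can still dominate for heavy, low-value goods.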
To examine the effects of using synthetic Fischer-Tropsch (FT) diesel fuel in a modern compression ignition engine, experiments were conducted on a MY 2002 Cummins 5.9 L diesel engine outfitted with high pressure, common rail fuel injection, a variable geometry turbocharger, cooled EGR and a fully configurable engine management computer. Additionally, the effects of varied injection timing and EGR rates were studied to examine how the engine can be optimized for FT fuel. The test fuels included two standard diesel fuels, one with 400 PPM sulfur content and the other 15 PPM sulfur. The experimental fuels were Syntroleum Corporation's S-1 fuel, as well as blends of 25% S-1 with a balance of 15 or 400 PPM D2. Tests were conducted at three engine operating conditions: 1682 RPM, 474 kPa BMEP; 2011 RPM, 1000 kPa BMEP; 2011 RPM, 1400 kPa BMEP. It was found that FT fuel reduced NOx emissions 19% in low load tests, but alone had little effect in higher load tests. FT fuel reduced particulate matter (PM) emissions in almost all test cases, on the order of 25 to 75%. Retarding injection timing and increasing EGR both reduce NOx emissions. In the case of standard fuels, these reductions come at the expense of increased PM. However, FT fuel reduced this effect and allows for more retarded timing and further increased EGR rates to control NOx. Blended fuels...
Smoothness is characteristic of coordinated human movements, and stroke patients' movements seem to grow smoother with recovery. A robotic therapy device was used to analyze five different measures of movement smoothness in the hemiparetic arm of thirty-one patients recovering from stroke. Four of the five metrics showed general increases in smoothness for the entire patient population. However, according to the fifth metric, the movements of patients with recent stroke grew less smooth over the course of therapy. This pattern was reproduced in a computer simulation of recovery based on submovement blending, suggesting that progressive blending of submovements underlies stroke recovery. Submovements are hypothesized fundamental building blocks of human movement. All available evidence is consistent with their existence, and no other theory has been proposed that can fully account for observed phenomena in human movement. However, there is no obvious way to prove their existence. Nevertheless, repeatedly successful decomposition of movement data into submovements may produce sufficient evidence to make the question moot. The component submovements of stroke patients' point-to-point movements were estimated using a novel submovement extraction algorithm. Over the course of therapy...
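Submovements in this literature are commonly modeled with minimum-jerk speed profiles; the sketch below superposes two such profiles to show how overlapping submovements blend into one composite movement. The profile is the standard minimum-jerk form; the amplitudes and timings are illustrative, and this is not the thesis's extraction algorithm.

```python
# Minimum-jerk submovement speed profile and a two-submovement blend.
# D = amplitude (m), T = duration (s), t0 = onset time (s); values below
# are illustrative only.

def minjerk_speed(t: float, D: float, T: float, t0: float = 0.0) -> float:
    """Speed at time t of a minimum-jerk submovement (zero outside [t0, t0+T])."""
    tau = (t - t0) / T
    if tau < 0.0 or tau > 1.0:
        return 0.0
    return (D / T) * (30 * tau**2 - 60 * tau**3 + 30 * tau**4)

def blended_speed(t: float) -> float:
    """Two overlapping submovements summed, as in submovement blending."""
    return (minjerk_speed(t, D=0.1, T=0.5)
            + minjerk_speed(t, D=0.1, T=0.5, t0=0.25))
```

As the onset gap shrinks, the two speed peaks of the blend merge into a single smoother peak, which is the qualitative signature of recovery the simulation in the abstract reproduces.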
The MIT Media Laboratory Robotic Life Group's Leonardo is a highly expressive robot used for, among other things, social learning and human-robot teamwork research. A mixed reality workspace was conceived to aid in experimentation and demonstration of human-robot interaction by providing a complex state space and several interaction possibilities. A box concept was selected for its ability to incorporate several interaction mechanisms while allowing for meaningful physical tasks. A first iteration of the system was completed, which was controllable primarily through serial communication with a computer, while providing minimal physical communication. For a second revision of the system, physical interaction devices were developed which could be actuated by either the robot or a human, so as to better explore social interaction. Further development of the project will yield a robust, flexible and expandable tool with which future robot social learning and teamwork research can be performed.; by Javier G. Matamoros.; Thesis (S.B.)--Massachusetts Institute of Technology, Dept. of Mechanical Engineering, 2006.; Includes bibliographical references (p. 25).
A mobile ad hoc network is a self-organized cooperative network that works
without any permanent infrastructure. This infrastructure-less design makes
it more complex than other wireless networks. Many attacks and kinds of
misbehavior obstruct its growth and implementation. The majority of attacks
and misbehavior can be handled by existing protocols, but these protocols
reduce the total strength of nodes in a network because they isolate nodes
with lower reputation values from network participation. To cope with this
problem we present the Possibility and Certainty model, which uses
reputation values to determine the possibilities and certainties in network
participation. The proposed model classifies nodes into three classes:
certain (HIGH grade), possible (MED grade), and not possible (LOW grade).
Choosing HIGH-grade nodes for network activities improves the packet
delivery ratio, which enhances the throughput of the MANET. On the other
hand, when node strength is poor, we choose MED-grade nodes for network
activities. Thus the proposed model allows communication in the worst
scenario with some possibility of success. It protects a network from
misbehavior by isolating LOW-grade nodes from routing
paths.; Comment: 10 Pages. International Journal of Computer Engineering and
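The three-way grading the abstract describes can be sketched as a simple reputation classifier with a fallback from HIGH- to MED-grade nodes. The thresholds, node names, and selection policy below are illustrative assumptions, not the paper's parameters.

```python
# Sketch of reputation-based node grading: HIGH (certain), MED (possible),
# LOW (not possible, excluded from routing). Thresholds are illustrative.

def grade_node(reputation: float, high_th: float = 0.7,
               med_th: float = 0.4) -> str:
    """Classify a node by its reputation value."""
    if reputation >= high_th:
        return "HIGH"   # certain participation
    if reputation >= med_th:
        return "MED"    # possible participation
    return "LOW"        # not possible: kept out of routing paths

def select_nodes(reputations: dict, needed: int) -> list:
    """Prefer the best-reputed non-LOW nodes; MED nodes serve as fallback
    when too few HIGH-grade nodes exist."""
    graded = sorted(reputations.items(), key=lambda kv: -kv[1])
    eligible = [n for n, r in graded if grade_node(r) != "LOW"]
    return eligible[:needed]
```

Excluding only LOW-grade nodes, rather than every node below the top threshold, is what lets the model keep communicating "in the worst scenario" instead of starving the network of relays.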
Earth, water, air, food, shelter and energy are essential factors required
for human beings to survive on the planet. Among these, energy plays a key
role in our day-to-day living, including lighting, cooling and heating of
shelter, and preparation of food. Due to this interdependency, the
production and distribution of energy, specifically electricity, became a
high-tech industry. Unlike other industries, the key differentiator of the
electricity industry is the product itself: it can be produced but cannot
be stored for future use; production and consumption happen almost in real
time. This peculiarity of the industry is the key driver for Machine
Learning and Data Science based innovations in it. There is always a gap
between demand and supply in the electricity market across the globe. To
fill the gap and improve service efficiency by providing the necessary
supply to the market, commercial as well as federal electricity companies
employ forecasting techniques to predict future demand, try to meet that
demand, and provide curtailment guidelines to optimise electricity
consumption/demand. In this paper the authors examine the application of
Machine Learning algorithms, specifically Boosted Decision Tree Regression...
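As a rough illustration of the boosted-decision-tree idea named above, here is a from-scratch sketch of gradient-boosted regression stumps fitted to a synthetic daily load curve. A real deployment would use a library implementation; the data, learning rate, and round count are all invented for illustration and are not the paper's model.

```python
# Gradient boosting for squared loss with depth-1 trees (stumps): each
# round fits a stump to the current residuals and adds a damped copy of
# it to the ensemble. Synthetic hour-of-day demand data, illustrative only.

def fit_stump(xs, residuals):
    """Find the 1-D threshold split minimizing squared error on residuals."""
    best = None
    for split in sorted(set(xs)):
        left = [r for x, r in zip(xs, residuals) if x <= split]
        right = [r for x, r in zip(xs, residuals) if x > split]
        if not left or not right:
            continue
        lmean, rmean = sum(left) / len(left), sum(right) / len(right)
        err = (sum((r - lmean) ** 2 for r in left)
               + sum((r - rmean) ** 2 for r in right))
        if best is None or err < best[0]:
            best = (err, split, lmean, rmean)
    _, split, lmean, rmean = best
    return lambda x: lmean if x <= split else rmean

def boost(xs, ys, rounds=100, lr=0.3):
    """Return an ensemble predictor built by residual fitting."""
    pred = [0.0] * len(xs)
    stumps = []
    for _ in range(rounds):
        residuals = [y - p for y, p in zip(ys, pred)]
        stump = fit_stump(xs, residuals)
        stumps.append(stump)
        pred = [p + lr * stump(x) for p, x in zip(pred, xs)]
    return lambda x: sum(lr * s(x) for s in stumps)

# Hour-of-day vs demand (MW): night base load, daytime level, evening peak.
hours = list(range(24))
demand = [300 if h < 7 else 420 if h < 18 else 520 for h in hours]
model = boost(hours, demand)
```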
The work presented in this paper is related to the area of Situational Method
Engineering (SME) which focuses on project-specific method construction. We
propose a faceted framework for understanding and classifying issues in SME
for system development. The framework identifies four different but complementary
viewpoints. Each view allows us to capture a particular aspect of situational
methods. Inter-relationships between these views show how they influence each
other. In order to study, understand and classify a particular view of SME in
its diversity, we associate a set of facets with each view. As a facet allows
an in-depth description of one specific aspect of SME, the views show the
variety and diversity of these aspects.
Knowledge management (KM) adoption in a supply chain network needs
considerable investment as well as a few changes in the culture of the
entire supply chain. Knowledge management is the process of creating,
distributing and transferring information. The goal of this study is to
rank KM criteria in the supply chain network in Iran, which is important
for firms these days. The criteria used in this paper were extracted from
the literature review and confirmed by supply chain experts. The proposed
approach for ranking and investigating these criteria is a hybrid fuzzy
DEMATEL-TODIM; by using fuzzy numbers as data for our studies we could
reduce uncertainty. The data were gathered from PhD and MSc students in
industrial engineering at Kharrazmi University of Tehran and PhD and MSc
students of the management department of Semnan University. The new hybrid
approach first ranks the criteria with respect to each other; then, by
using TODIM for ranking with respect to the best situation (gains), the
rates of the criteria were determined, which is an important advantage.
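The crisp core of the DEMATEL step (after the fuzzy numbers are defuzzified) can be sketched as follows: normalize the direct-influence matrix and approximate the total-relation matrix T = X(I - X)^(-1) by summing matrix powers. The 3x3 criteria matrix here is invented for illustration and is not the study's data; the normalization shown is a simplified row-sum variant.

```python
# Simplified DEMATEL sketch: X is the normalized direct-relation matrix;
# T = X + X^2 + X^3 + ... approximates X(I - X)^(-1) since the spectral
# radius of X is below one. Pairwise-influence data is illustrative.

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def dematel_total(A, terms=200):
    n = len(A)
    s = max(sum(row) for row in A)           # row-sum normalization factor
    X = [[a / s for a in row] for row in A]
    T = [[0.0] * n for _ in range(n)]
    P = [row[:] for row in X]                # current power X^k
    for _ in range(terms):
        T = [[t + p for t, p in zip(tr, pr)] for tr, pr in zip(T, P)]
        P = matmul(P, X)
    return T

# Invented 0-4 influence scores among three KM criteria:
A = [[0, 3, 2],
     [1, 0, 3],
     [2, 1, 0]]
T = dematel_total(A)
# Row sums of T (influence given) and column sums (influence received)
# then separate cause criteria from effect criteria.
```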
Using data from a web-based survey of software developers, the author
attempts to determine root causes of "death march" projects and excessive work
hours in the software industry in relation to company practices and management.
Special emphasis is placed on the factor of business versus technical
supervisor background. An analysis of variance revealed significant
differences between these supervisor groups with regard to a "Pointy-Haired
Boss" (PHB) sentiment index. This difference, combined with correlations
between the PHB index and the endpoints of project failure and use of
software engineering practices, indicates some disparity in the suitability
of business-background supervisors to manage software development projects
compared with their technical-background counterparts. Other survey data
points to improved project management skills as the biggest necessity for
supervisors in the business-background group.
Vehicle passengers and other traffic participants are increasingly
protected by integral safety systems. These systems continuously perceive
the vehicle's environment to prevent dangerous situations, e.g. through
emergency braking. Furthermore, increasingly intelligent vehicle functions
remain of major interest in research and development to reduce the risk of
accidents. However, the development and testing of these functions should
not rely only on validations on proving grounds and on long-term test-runs
in real traffic; instead, they should be extended by virtual testing
approaches to model potentially dangerous situations or to re-run specific
traffic situations easily. This article outlines meta-metrics as one of
today's challenges for the software engineering of these cyber-physical
systems, providing guidance during system development: for example,
unstable results of simulation test-runs over a vehicle function's revision
history serve as an indicating metric for where to focus real or further
virtual test-runs; furthermore, varying acting time points for the same
virtual traffic situation indicate problems with the reliability of
interpreting the specific situation. In this article, several such
meta-metrics are discussed and assigned both to different phases of series
development and to different levels of detail of virtual testing approaches.
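One of the meta-metrics described, verdict instability across a function's revision history, can be sketched as a simple flip count. The test names, verdict histories, and threshold-free ranking below are invented for illustration and are not the article's exact definition.

```python
# Instability meta-metric sketch: count pass/fail flips of a simulation
# test verdict across consecutive revisions. Histories are illustrative.

def instability(verdicts: list) -> int:
    """Number of verdict changes between consecutive revisions."""
    return sum(1 for a, b in zip(verdicts, verdicts[1:]) if a != b)

history = {
    "emergency_brake_scenario_12": ["pass", "pass", "fail", "pass", "fail"],
    "lane_change_scenario_3": ["pass", "pass", "pass", "pass", "pass"],
}

# Scenarios with the highest instability are candidates for real or
# further virtual test-runs:
flakiest = max(history, key=lambda t: instability(history[t]))
```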
A Software Engineering project depends significantly on team performance,
as does any activity that involves human interaction. In recent years, the
traditional perspective on software development has been changing and agile
methods have received considerable attention. Among other attributes,
agilists claim that fostering creativity is one of the keys to responding
to the common problems and challenges of software development today. The
development of new software products requires the generation of novel and
useful ideas. Agile is a conceptual framework introduced in the Agile
Manifesto in 2001. This paper is written in support of agile practices,
addressing the significance of teamwork for the success of software
projects. A survey is used as the research method to assess the
significance of teamwork.; Comment: 4 pages, 4 figures
Source: University of Delaware; Publisher: University of Delaware
Type: Doctoral thesis
Winbladh, Kristina; With rapidly changing and growing software, Requirements Engineering (RE) has emerged as one of the primary interests of industry. There are currently tools that catalyze the RE process by featuring functionality to sort and manage requirements; however, they fall short with regard to eliciting useful information from customers and other stakeholders. This thesis focuses on
determining the criteria of a novel RE method that overcomes the limitations of existing RE methods and implementing it as a tool. The novel method is based on a thorough analysis of available user narration tools from similar industries: User eXperience (UX) and social video gaming. In order to effectively capture the social processes that are often in place among a set of stakeholders, specific attention is given to narration tools that capture emotional changes and patterns among stakeholders. Finally, we integrate features corresponding to our criteria into the design and implementation of a new and improved RE tool, iMuse – Integrated Model-based Use-case and Storytelling Environment.; University of Delaware, Department of Electrical and Computer Engineering; M.S.