  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
101

Liquidity Risk Situation Of Turkish Insurance Industry And Firm Specific Factors Affecting Liquidity

Balkanli, Aysegul 01 April 2010 (has links) (PDF)
Recent changes in insurance regulations and laws in Turkey have led the insurance industry to gain further importance and to become one of the rising sectors in Turkish financial markets. However, while these favorable developments were taking place in Turkish markets, the global economic crisis that began with the collapse of the US sub-prime mortgage market deepened, and a credit crunch arose in its aftermath. In times of credit stress, a sound liquidity base is important for firms; in the recent economic crisis, liquidity-related troubles resulted in bailouts or takeovers of giant financial companies. To prevent the negative consequences of inadequate liquidity and to sustain financial stability, maintaining an appropriate level of liquidity is especially crucial for financial companies such as insurers. This thesis aims to analyze the liquidity condition of the Turkish insurance sector for the period between 2002 and 2008 with the help of liquidity ratios. Considering the nature of the business, a distinction is made between "non-life" and "life" insurance companies when assessing their liquidity ratios. Furthermore, a panel data regression analysis is conducted to determine the firm-specific factors affecting the liquidity decisions of non-life insurers operating in Turkey.
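A panel data regression of the kind described above can be sketched as a within-transform fixed-effects OLS on a synthetic firm-year panel. The variable names (size, leverage), the data, and the coefficients are all illustrative assumptions, not the thesis's actual specification.

```python
import numpy as np

def fixed_effects_ols(y, X, firm_ids):
    """Within-transform fixed-effects OLS: demean y and X by firm, then least squares."""
    y = np.asarray(y, float)
    X = np.asarray(X, float)
    yd, Xd = y.copy(), X.copy()
    for f in np.unique(firm_ids):
        m = firm_ids == f
        yd[m] -= y[m].mean()            # remove the firm's own mean
        Xd[m] -= X[m].mean(axis=0)      # so firm fixed effects drop out
    beta, *_ = np.linalg.lstsq(Xd, yd, rcond=None)
    return beta

# Synthetic panel: 10 firms x 7 years; liquidity driven by size and leverage
# plus an unobserved firm effect (purely illustrative data).
rng = np.random.default_rng(0)
n_firms, n_years = 10, 7
firm_ids = np.repeat(np.arange(n_firms), n_years)
size = rng.normal(0, 1, n_firms * n_years)
leverage = rng.normal(0, 1, n_firms * n_years)
firm_effect = rng.normal(0, 2, n_firms)[firm_ids]
liquidity = 0.5 * size - 0.8 * leverage + firm_effect + rng.normal(0, 0.1, n_firms * n_years)

beta = fixed_effects_ols(liquidity, np.column_stack([size, leverage]), firm_ids)
print(beta)  # recovers coefficients near 0.5 and -0.8
```

The within transform makes the estimate immune to time-invariant firm heterogeneity, which is why it is a common baseline for firm-level panels like the one the thesis studies.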
102

Mass Customizing The Relations Of Design Constraints For Designer-built Computational Models

Ercan, Selen 01 September 2010 (has links) (PDF)
The starting motivation of this study is to develop an intuitively strong approach to addressing architectural design problems through computational models. Within the scope of the thesis, the complexity of an architectural design problem is modeled computationally by translating the design reasoning into parameters, constraints, and the relations between these. Such a model can easily become deterministic and defeat its purpose if it is customized with pre-defined, unchangeable relations between the constraints. This study acknowledges that the relations between design constraints are bound to change in architectural design problems, as exemplified in the graduation project of the author. As such, any computational design model should remain open to modification by the designer. The findings of the research and the architectural design experiments in the showcase project suggest that this is possible if mass-customized sequences of abstract, modifiable, and reusable relations link the design constraints with each other in the model. Within the scope of this thesis, the designer actions are mass-customized sequences of relations that may be modified to fit the small design tasks of relating specific design constraints: they relate the constraints in sequence, and are mass-customized in an abstract, modifiable, and reusable manner. In this study, they are encoded as Rhino Grasshopper definitions. Because these mass-customized relations are modifiable, they are seen as a remedy that enables designers to build models meeting the individual and intuitive needs of the design problems they define.
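The idea of abstract, modifiable, reusable relations chained between constraints can be illustrated, outside Grasshopper, as composable functions. Every name and number below is a hypothetical example, not taken from the thesis; Grasshopper definitions are visual, but the re-sequencing idea is the same.

```python
# Hypothetical sketch: constraints are values, relations are small reusable
# functions, and the designer can reorder, swap, or extend the sequence.

def clamp(lo, hi):
    """Relation: keep a driven value within [lo, hi]."""
    return lambda v: max(lo, min(hi, v))

def scale(factor):
    """Relation: derive one constraint from another by a factor."""
    return lambda v: v * factor

def chain(*relations):
    """Sequence relations; any step can later be replaced without rebuilding the model."""
    def run(v):
        for r in relations:
            v = r(v)
        return v
    return run

# A designer-built sequence: floor area drives corridor width, clamped to code limits.
corridor_width = chain(scale(0.01), clamp(1.2, 3.0))
print(corridor_width(500))  # clamped at the 3.0 upper bound
print(corridor_width(150))  # ~1.5, inside the allowed range
```

Swapping `scale(0.01)` for a different relation, or inserting a new step into `chain`, changes the model without touching the other relations, which is the modifiability the abstract argues for.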
103

Psychological Problems Of Prisoners On The Bases Of Their Upon-release Future Expectations And Personality Characteristics: The Importance Of Being Parent And Time Left Before Release

Karaca, Ozlem 01 December 2010 (has links) (PDF)
The main purpose of the study was to obtain an estimate of the Upon-Release Future Expectations of prisoners and to examine the associations between these expectations and prisoners' psychological problems. In addition, the study aimed to examine the effects of being a parent and of the time left before release on prisoners' Upon-Release Future Expectations and psychological problems. For these purposes, the Upon-Release Future Expectations Scale was first developed and its reliability investigated. The Positive-Negative Affect Scale, the Beck Depression Scale, the Trait Form of the State-Trait Anxiety Inventory, and the Hopelessness Scale were used to test its criterion-related validity. Then, to reveal the associations between the variables, two sets of regression analyses were conducted. In the first regression analysis, age, gender, time left before release, parental status (i.e., being a parent or not), and scores on Rosenbaum's Learned Resourcefulness Scale and the Basic Personality Traits Inventory were used as independent variables, and the revealed factors of Upon-Release Future Expectations (i.e., Future Conditions, Perceived Risks, and Confidence in Coping) were entered as dependent variables. In the second regression analysis, besides the independent variables of the first analysis, the factors of Upon-Release Future Expectations were used as independent variables, and depression, trait anxiety, and hopelessness scores were entered as dependent variables. The results did not reveal a main effect for time left before release or parental status. Both the significant and the insignificant associations between the dependent and independent variables are discussed. The study was conducted with 96 female and 84 male prisoners.
104

A Study Of Brightwater Injection Efficiency On Sector Model Using Stars Software

Pashayev, Nariman 01 September 2011 (has links) (PDF)
Maintaining proper waterflood conformance is a critical component of waterflood management, yet most methods used to control it have proven only marginally effective. A unique technique has been developed for creating a durable reservoir flow restriction that diverts injected water into unswept reservoir sections; placement of the restriction is based on the location of the thermal front between the injector and the producers. BRIGHTWATER, a thermally activated nano-sized particle system, was developed to provide this restriction. A sector model of the ACG field was built to study the applicability of BRIGHTWATER injection in the ACG field. A decrease in oil production and an increase in water production were seen in wells after production started, and water cuts were high for the South flank wells. The simulation showed that unswept zones remained, so it was decided to apply this new technology in this thesis work. Several runs were conducted to study the effects of BRIGHTWATER concentration, crosslinker concentration, injection rate and pressure, injection temperature, injection times, and injection well locations. The results are given in tables and figures and briefly discussed; the best and worst cases are selected from the results and analyzed in detail. Finally, an economic analysis is given. It was observed that injecting the polymer in slug form is better than continuous injection, and that injecting the polymer at early times may give better results. Injection of the polymer in 3 slugs separated by 6-month injection periods seems most beneficial. According to the simulation results, the optimum polymer injection temperature was 780 F. Good results were obtained when the polymer was injected at 65,000, 75,000, and 85,000 bbl/day injection rates. Oil recoveries obtained during the simulation were in the range of 1.4% to 3.8%, corresponding to an additional recovery of 11 to 31 MMSTB of oil. BRIGHTWATER injection was found to be applicable to the ACG field.
105

Controller Design And Simulation For A Helicopter During Target Engagement

Avcioglu, Sevil 01 December 2011 (has links) (PDF)
The aim of this thesis is to design a controller for an unmanned helicopter to perform target engagement. This mission is briefly defined as follows: the helicopter flies to a firing point under the commands of a trajectory controller and is then aligned to the target with attitude control; after weapon firing, the helicopter initiates a return maneuver, again under the commands of the trajectory controller. This mission, in which continuous systems and discrete guidance decisions must be executed in coherence, can be studied as a hybrid control problem. The hybrid control approach used in this study is a representation based on two motion primitives: trim trajectories and maneuvers. To obtain the desired trim trajectories and maneuvers, a dynamic inversion based controller is developed. The controller has two loops: an inner loop that controls the helicopter attitudes and an outer loop that controls the helicopter trajectory. A guidance algorithm is developed that enables the controller to switch from the inner loop to the outer loop and vice versa. Simulations are run to test the controller performance.
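The two-loop structure described above can be sketched on a toy 1-D plant where lateral acceleration depends on attitude (small-angle model x_ddot = g·theta). The plant, gains, and time constant are illustrative assumptions, not the helicopter model used in the thesis; the point is only the inner-attitude / outer-trajectory layering with dynamic inversion.

```python
# Toy two-loop controller: the outer loop computes a desired acceleration and
# inverts the plant map (x_ddot = G * theta) into an attitude command; the
# inner attitude loop is modeled as a first-order response. Illustrative only.

G = 9.81             # maps attitude angle to lateral acceleration
TAU = 0.2            # inner-loop attitude response time constant (assumed)
K_P, K_D = 1.0, 2.0  # outer-loop trajectory gains (assumed)

def outer_loop(x, x_dot, x_ref):
    """Trajectory controller: desired acceleration, inverted to an attitude command."""
    a_des = -K_P * (x - x_ref) - K_D * x_dot
    return a_des / G  # dynamic inversion of x_ddot = G * theta

def simulate(x_ref=10.0, dt=0.01, t_end=20.0):
    x = x_dot = theta = 0.0
    for _ in range(int(t_end / dt)):
        theta_cmd = outer_loop(x, x_dot, x_ref)
        theta += dt * (theta_cmd - theta) / TAU  # inner attitude loop
        x_dot += dt * G * theta                  # plant dynamics
        x += dt * x_dot
    return x

print(simulate())  # settles near the 10.0 m reference
```

Because the outer loop cancels the known plant map before applying simple PD gains, the closed loop behaves like a linear second-order system plus the inner-loop lag, which is the basic appeal of dynamic inversion designs.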
106

Prototype Development And Verification For An Ip Lookup Engine On Fpgas Performance Study

Ozkaner, Akin 01 February 2012 (has links) (PDF)
The increasing use of the internet demands more powerful routers with higher speed, lower power consumption, and a smaller physical footprint. The IP lookup operation is one of the major concerns in today's routers for providing such attributes. To accomplish IP lookup on routers, hardware- or software-based solutions can be used. In this thesis, an SRAM-based pipelined architecture proposed earlier for ASIC implementation is re-designed and implemented on an FPGA in the form of a BRAM-based pipelined 8x8 torus architecture using Xilinx ISE, and is simulated and verified using the ModelSim simulator. The modifications and improvements necessary for the FPGA implementation are carried out. The results of our experiments, performed for a real router lookup table and a real-time traffic load with various optimizations, are also presented. Our study and design effort demonstrate the feasibility of an FPGA implementation of the proposed technique, albeit with a considerable performance penalty.
107

Naturalism and naturalisms in 20th-century Portuguese painting and the Sociedade Nacional de Belas-Artes

Tavares, Cristina Azevedo, 1956- January 1999 (has links)
No description available.
108

Automated Quantification of Biological Microstructures Using Unbiased Stereology

Bonam, Om Pavithra 01 January 2011 (has links)
Research in many fields of the life and biomedical sciences depends on microscopic analysis of biological images. Quantitative analysis of these images is often time-consuming and tedious, and may be prone to subjective bias from the observer and to inter-/intra-observer variation. Systems for automatic analysis developed in the past decade determine various parameters associated with biological tissue, such as the number of cells, object volume, and fiber length, to avoid the problems of manual collection of microscopic data. Specifically, automatic analysis of biological microstructures using unbiased stereology, a set of approaches designed to avoid all known sources of systematic error, plays a large and growing role in bioscience research. Our aim is to develop an algorithm that automates and increases the throughput of a commercially available, computerized stereology device (Stereologer, Stereology Resource Center, Chester, MD). The current method for estimating first- and second-order parameters of biological microstructures requires a trained user to manually select biological objects of interest (cells, fibers, etc.) while systematically stepping through the three-dimensional volume of a stained tissue section. The present research proposes a three-part method to automate this process: detect the objects, connect the objects through a z-stack of images (images at varying focal planes) to form 3D objects, and finally count the 3D objects. The first step involves detection of objects through learned or automatic thresholding. Learned thresholding identifies the objects of interest by training on images to obtain the threshold range for those objects; automatic thresholding is performed on gray-level images converted from RGB (red-green-blue) microscopic images. Both learned and automatic thresholding are followed by iterative thresholding to separate objects that are close to each other.
The second step, linking objects through a z-stack of images, involves labeling the objects of interest using connected-component analysis and then connecting these labeled objects across the stack of images to produce 3D objects. Finally, the number of linked objects in a 3D volume is counted using the counting rules of stereology. This automatic approach achieves an overall object detection rate of 74%. These results support the view that automatic image analysis, combined with unbiased sampling and assumption- and model-free geometric probes, provides accurate and efficient quantification of biological objects.
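The threshold-link-count pipeline above can be sketched in a few lines: binarize each focal plane, then let a 3-D connected-component labeling link objects across neighboring planes. The threshold value and the synthetic stack are illustrative assumptions; the Stereologer's learned/iterative thresholding and stereological counting rules are not reproduced here.

```python
import numpy as np
from scipy import ndimage

def count_objects(z_stack, threshold):
    """Binarize a (z, y, x) image stack and count 3-D connected components."""
    binary = z_stack > threshold
    # 26-connectivity links an object seen in adjacent focal planes into one 3-D object
    structure = np.ones((3, 3, 3), dtype=int)
    _, n_objects = ndimage.label(binary, structure=structure)
    return n_objects

# Synthetic 5-plane stack with two bright "cells" (invented data):
# one object spanning planes 1-3, one visible in a single plane.
stack = np.zeros((5, 20, 20))
stack[1:4, 5:8, 5:8] = 200.0
stack[2, 14:17, 14:17] = 180.0
print(count_objects(stack, threshold=100))  # 2
```

Linking through the z-stack is what prevents a cell that appears in three focal planes from being counted three times, which is the core of the second and third steps described above.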
109

Antecedents and Consequences of Channel Alienation: An Empirical Investigation within Franchised Channels of Distribution

Lapuka, Ivan 31 December 2010 (has links)
Investigating an important but overlooked phase of interorganizational relationship evolution, which is currently hypothesized to progress through five stages of awareness, exploration, expansion, commitment, and dissolution, this dissertation proposes that on the long road between commitment and dissolution, the quintessential interfirm relationship is likely to be characterized by a prolonged period of relationship alienation, which then becomes the immediate precursor to the dissolution stage. The dissertation utilizes social learning theory, behavior constraint theory, and alienation theory to explain the apathetic behaviors of franchisees. The principal proposition is that certain characteristics of the franchise system's operating environment inadvertently condition franchisee estrangement and failure, and that these maladaptive behaviors persist even after environmental changes make success possible again. The dissertation proposes and empirically tests a conceptual model of franchisee alienation. Data on dyadic franchisee-franchisor relationships (N = 185) across a wide variety of industries were obtained through a survey of franchisee organizations that were members of the Franchise Council of Australia (FCA). The results support the central hypothesis that franchisee alienation occurs as a result of the franchisee organization disconnecting its own actions from the outcomes of its interactions with the franchisor. Franchisee alienation is shown to be extremely toxic for the franchise system as a whole, as the alienated franchisee is likely to engage in opportunistic behaviors, exhibits reduced productivity, and is inclined to litigate against the franchisor and to dissolve the relationship. On the basis of these findings, the dissertation prescribes strategies the franchisor can use to prevent and combat franchisee alienation.
110

Semisupervised sentiment analysis of tweets based on noisy emoticon labels

Speriosu, Michael Adrian 02 February 2012 (has links)
There is high demand for computational tools that can automatically label tweets (Twitter messages) as having positive or negative sentiment, but great effort and expense would be required to build a large enough hand-labeled training corpus on which to apply standard machine learning techniques. Going beyond current keyword-based heuristic techniques, this paper uses emoticons (e.g. ':)' and ':(') to collect a large training set with noisy labels using little human intervention, and trains a Maximum Entropy classifier on that training set. Results on two hand-labeled test corpora are compared to various baselines and a keyword-based heuristic approach, with the machine-learned classifier significantly outperforming both.
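The noisy-labeling idea above can be sketched as follows: tweets containing ':)' become positive training examples, tweets containing ':(' become negative ones, and the emoticon is stripped from the features so the model must learn from the remaining words. The thesis trains a Maximum Entropy classifier; for a self-contained illustration, a tiny Naive Bayes model stands in here, and all the tweets are invented.

```python
import math
from collections import Counter

def noisy_label(tweet):
    """Assign a noisy label from emoticons; ambiguous tweets are skipped."""
    if ':)' in tweet and ':(' not in tweet:
        return 'pos'
    if ':(' in tweet and ':)' not in tweet:
        return 'neg'
    return None

def tokens(tweet):
    """Word features with the label-defining emoticons removed."""
    return [w for w in tweet.lower().split() if w not in (':)', ':(')]

def train(tweets):
    counts = {'pos': Counter(), 'neg': Counter()}
    for t in tweets:
        lab = noisy_label(t)
        if lab:
            counts[lab].update(tokens(t))
    return counts

def classify(counts, tweet):
    """Naive Bayes with add-one smoothing over the pooled vocabulary."""
    vocab = len(set(counts['pos']) | set(counts['neg']))
    score = {}
    for lab, c in counts.items():
        total = sum(c.values())
        score[lab] = sum(math.log((c[w] + 1) / (total + vocab)) for w in tokens(tweet))
    return max(score, key=score.get)

corpus = [
    "love this phone :)", "great day :)", "awesome show :)",
    "hate the rain :(", "terrible service :(", "awful traffic :(",
]
model = train(corpus)
print(classify(model, "what a great awesome day"))  # pos
```

The attraction of the approach is exactly what the abstract claims: the emoticons supply labels for free, so the training set can grow far beyond what hand annotation would allow, at the cost of some label noise.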
