  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
151

Gestão de projetos de P&D no IPEN: diagnóstico e sugestões ao Escritório de Projetos (PMO) / Project management of R&D in IPEN - Diagnosis and Suggestions to the Project Management Office (PMO)

Hannes, Egon Martins 12 March 2015 (has links)
This study aims to understand the dynamics of project management at IPEN. To that end, the academic literature was searched for models that, after modification and adjustment, could reflect the reality of projects at public research and development institutes. After statistical treatment of the data, several hypotheses were validated and shown to influence project management performance positively, among them the composition of project teams and the effect of leadership. The model also proved valid for explaining which factors are relevant to project success. One of the main objectives was precisely to use a project management evaluation model amenable to statistical validation, rather than one of the models available on the market such as P3M3 or OPM3, so that the results could be statistically controlled and confirmed. Another objective was to use a model whose statements reflected the nature of the research and development projects managed by IPEN researchers. These statements were formulated, distributed via a web survey, and answered by nearly one hundred IPEN professionals involved in R&D projects. This dissertation, together with the recommendations at its end, is intended as a contribution to the work of the IPEN Project Management Office. The evaluation model it contains can be applied at other Brazilian R&D institutions so that they may assess how they manage their own projects.
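Statistical validation of survey constructs of the kind described above typically starts with an internal-consistency check. As a minimal sketch, using synthetic Likert-style responses rather than the actual IPEN survey data, Cronbach's alpha can be computed with NumPy:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical Likert-style responses (1-5) from 100 respondents, 4 items,
# constructed to share a common underlying score (so alpha should be high).
rng = np.random.default_rng(0)
base = rng.integers(1, 6, size=(100, 1))
noise = rng.integers(-1, 2, size=(100, 4))
scores = np.clip(base + noise, 1, 5)

alpha = cronbach_alpha(scores)
print(round(alpha, 3))
```

Values above roughly 0.7 are conventionally read as acceptable internal consistency for a scale.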
152

Multivariate non-invasive measurements of skin disorders

Nyström, Josefina January 2006 (has links)
The present thesis proposes new methods for obtaining objective and accurate diagnoses in modern healthcare. Non-invasive techniques have been used to examine or diagnose three different medical conditions: neuropathy among diabetics, radiotherapy-induced erythema (skin redness) among breast cancer patients, and cutaneous malignant melanoma. The techniques used were Near-InfraRed spectroscopy (NIR), Multi-Frequency Bio-Impedance Analysis of the whole body (MFBIA-body), Laser Doppler Imaging (LDI) and Digital Colour Photography (DCP).

Neuropathy in diabetics was studied in papers I and II. The first study was performed on diabetics and control subjects of both genders. A separation was seen between males and females, so the data had to be divided by gender to obtain good models; once that division was made, NIR spectroscopy was shown to be a viable technique for measuring neuropathy. The second study on diabetics, in which MFBIA-body was added to the analysis, was performed on males exclusively. Principal component analysis showed that healthy reference subjects tend to separate from diabetics, and that diabetics with severe neuropathy separate from persons less affected.

The preliminary study presented in paper III was performed on breast cancer patients to investigate whether NIR, LDI and DCP can detect radiotherapy-induced erythema. Its promising results motivated a new and larger study, presented in papers IV and V, intended to investigate the measurement techniques further and also to examine the effect that two different skin lotions, Essex and Aloe vera, have on the development of erythema. The Wilcoxon signed-rank test showed that DCP and NIR could detect erythema developed during one week of radiation treatment; LDI could detect erythema developed during two weeks of treatment. None of the techniques could detect any difference between the two lotions regarding the development of erythema.

The use of NIR to diagnose cutaneous malignant melanoma is presented as unpublished results in this thesis. This study gave promising but inconclusive results; NIR could be of interest for future development of instrumentation for the diagnosis of skin cancer.
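The paired before/after comparison reported above can be sketched with SciPy's Wilcoxon signed-rank test; the patient readings below are simulated for illustration and are not from the thesis:

```python
import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(1)

# Hypothetical paired skin-redness readings for 20 patients:
# baseline vs. after one week of radiotherapy, with a simulated
# systematic increase standing in for developing erythema.
baseline = rng.normal(50.0, 5.0, size=20)
week_one = baseline + rng.normal(2.0, 1.5, size=20)

stat, p_value = wilcoxon(baseline, week_one)
print(f"W = {stat:.1f}, p = {p_value:.4f}")
```

A small p-value indicates a consistent paired shift; with real data the direction and size of the shift would come from the measurements, not the simulation.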
153

Explorative Multivariate Data Analysis of the Klinthagen Limestone Quarry Data / Utforskande multivariat analys av Klinthagentäktens projekteringsdata

Bergfors, Linus January 2010 (has links)
Current quarry planning at Klinthagen is rough, which provides an opportunity to introduce new methods to improve quarry yield and efficiency. Nordkalk AB, active at Klinthagen, wishes to start a new quarry at a nearby location. To exploit future quarries efficiently and ensure production quality, multivariate statistics may help extract important information.

In this thesis the multivariate statistical approaches of Principal Component Analysis (PCA) and Partial Least Squares (PLS) regression were evaluated on the Klinthagen bore data. PCA data were spatially interpolated by Kriging, which was also evaluated and compared with IDW interpolation.

Principal component analysis supplied an overview of the relations between the variables, but also visualised the problems involved in linking geophysical data to geochemical data and the inaccuracy introduced by deficient data quality. The PLS regression further emphasised the geochemical-geophysical problems, but also showed good precision when applied to strictly geochemical data. Spatial interpolation by Kriging did not give significantly better approximations than the less complex control interpolation by IDW.

To improve the information content of the data when modelled by PCA, a more discrete sampling method would be advisable. Data quality may cause trouble, though with today's sampling technique it was considered of minor consequence. If a single geophysical component is to be predicted from chemical variables, further geophysical data are needed to complement the existing data before satisfactory PLS models can be achieved. The stratified rock composition caused trouble when spatially interpolated; further investigations should be performed to develop more suitable interpolation techniques.
154

Essays on nonlinear time series analysis and health economics

Ovanfors, Anna January 2006 (has links)
Diss. Stockholm: Handelshögskolan, 2006. pp. 1-125: 4 essays
156

Identifying Factors Influencing The Acceptance Of Processes: An Empirical Investigation Using The Structural Equation Modeling Approach

Degerli, Mustafa 01 July 2012 (has links) (PDF)
The main aim of this research was to develop an acceptance model for processes, namely the Process Acceptance Model (PAM). For this purpose, a questionnaire comprising three parts and 81 questions was developed to collect quantitative and qualitative data from people working with certain process-focused models and standards (CMMI, ISO 15504, ISO 9001, ISO 27001, AQAP-160, AQAP-2110, and/or AS 9100). The questionnaire was revised and refined through expert reviews and a pilot study with 60 usable responses. After the reviews, refinements and piloting, the questionnaire was deployed, and a total of 368 usable responses was collected. The collected data were screened for incorrectly entered data, missing data, outliers and normality, and the reliability and validity of the questionnaire were ensured. Partial least squares structural equation modeling (PLS-SEM) was applied to develop the PAM: exploratory and confirmatory factor analyses were performed, and the initial model was estimated and evaluated. The initial model was then modified as required by PLS-SEM, the confirmatory factor analysis was repeated, and the modified final model was estimated and evaluated. Consequently, the PAM was developed, with 18 factors and their statistically significant relationships. Furthermore, descriptive statistics and t-tests were applied to uncover interesting, meaningful and important points to be taken into account regarding the acceptance of processes. Analysis of the collected qualitative data revealed three additional factors relevant to the acceptance of processes. Finally, a checklist to test and promote the acceptance of processes was established.
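The group comparisons via t-tests mentioned in this abstract can be sketched with SciPy; the groups and scores below are hypothetical, not taken from the actual questionnaire data:

```python
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(7)

# Hypothetical mean acceptance scores (1-5 scale) for two respondent groups,
# e.g. those with vs. without prior experience of a process model.
with_experience = rng.normal(4.0, 0.6, size=180)
without_experience = rng.normal(3.6, 0.7, size=188)

# Welch's t-test (no equal-variance assumption between the groups).
t_stat, p_value = ttest_ind(with_experience, without_experience, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```

A significant result here would suggest the two groups differ in mean acceptance; the PLS-SEM modelling the study actually performs goes further, estimating relationships among latent factors.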
157

Multivariate data analysis using spectroscopic data of fluorocarbon alcohol mixtures / Nothnagel, C.

Nothnagel, Carien January 2012 (has links)
Pelchem, a commercial subsidiary of Necsa (South African Nuclear Energy Corporation), produces a range of commercial fluorocarbon products while driving research and development initiatives to support the fluorine product portfolio. One such initiative is to develop improved analytical techniques to analyse product composition during development and to assure the quality of products. The C–F type products produced by Necsa are generally in a solution of anhydrous HF and cannot be analysed directly with traditional techniques without derivatisation. A technique such as vibrational spectroscopy, which can analyse these products directly without further preparation, therefore has a distinct advantage. However, spectra of mixtures of similar compounds are complex and not suitable for traditional quantitative regression analysis. Multivariate data analysis (MVA) can be used in such instances to exploit the complex nature of the spectra and extract quantitative information on the composition of mixtures. A selection of fluorocarbon alcohols was made to act as representatives of fluorocarbon compounds. Experimental design theory was used to create a calibration range of mixtures of these compounds. Raman and infrared (NIR and ATR–IR) spectroscopy were used to generate spectral data of the mixtures, and this data was analysed with MVA techniques by constructing regression and prediction models. Selected samples from the mixture range were chosen to test the predictive ability of the models. Analysis and regression models (PCR, PLS2 and PLS1) gave good model fits (R2 values larger than 0.9). Raman spectroscopy was the most efficient technique and gave high prediction accuracy (at 10% accepted standard deviation), provided the minimum mass of a component exceeded 16% of the total sample. The infrared techniques also performed well in terms of fit and prediction. The NIR spectra were subject to signal saturation as a result of using long-path-length sample cells, which was shown to be the main reason for the loss in efficiency of this technique compared to Raman and ATR–IR spectroscopy. It was shown that multivariate data analysis of spectroscopic data of the selected fluorocarbon compounds can be used to quantitatively analyse mixtures, with the possibility of further optimisation of the method. The study indicates that the combination of MVA and spectroscopy can also be used successfully in the quantitative analysis of other fluorocarbon compound mixtures. / Thesis (M.Sc. (Chemistry))--North-West University, Potchefstroom Campus, 2012.
159

政府績效管理資訊化的交易成本分析:以「政府計畫管理資訊網」為例 / Information and communication technologies (ICTs) and government performance management: A case study of GPMnet in Taiwan

謝叔芳, Hsieh, Hsu Fang Unknown Date (has links)
Since the government reinvention movement of the 1980s, performance management and information and communication technologies (ICTs) have become important tools for improving government performance. Against this background, Taiwan completed the integration of the Policy Program Management Information System (GPMnet) in 2005 to support performance management operations. However, because information technology is broad in scope and wide-ranging in its effects, it has provoked debate among optimists, pessimists and pragmatists, and its actual effectiveness requires further evaluation. Building on the literature, this study adopts a transaction cost approach: a questionnaire survey was first used to understand the attitudes and behavioural preferences of GPMnet users, and interview data were then used to analyse how ICTs increase and decrease the costs of government performance management.

The study follows a mixed-methodology design, collecting and analysing both quantitative and qualitative data. For the quantitative part, a survey of GPMnet users yielded 148 valid responses. For the qualitative part, eight GPMnet users were interviewed, selected across the four permission roles of programme sponsor, supervisor, joint reviewer, and research and evaluation staff, to understand the experiences and views of users with different permissions.

The survey data were analysed with partial least squares. The results show that the perceived transaction costs of using GPMnet are significantly and negatively related to attitudes and to subjective system performance, while the hypotheses concerning uncertainty, asset specificity and usage frequency were not supported by the empirical data. The interview analysis found that, institutionally, because different agencies run different information systems, GPMnet comprises multiple subsystems, and paper-based processes persist, using GPMnet for performance management adds an administrative cost burden. In actual use, however, the system reduced administrative transaction costs by preserving historical data, providing clear data fields, transmitting documents online, supporting progress control and proactively disclosing information; at the same time, disproportionate learning time, time-consuming communication, proofreading, information overload, an unfriendly interface and system instability increased the transaction costs of performance management.

Finally, the study recommends, for research, that the observed variables of structural models be designed more carefully and that information-system evaluation theory give more weight to the cost perspective; in practice, electronic performance management should be implemented across the board, and data in the GPMnet environment should be backed up to reduce information overload. / Governments invest much more attention, time, and money in performance management and evaluation of the public sector today than ever before. To better utilize agency program management systems under the Executive Yuan, the Research, Development and Evaluation Commission (RDEC) completed the planning of the Policy Program Management Information System (Government Program network, GPMnet), a common service platform created to integrate various policy implementation management information systems and enhance the performance of different agencies in program management. The performance of GPMnet itself, however, needs to be evaluated. To that end, this study presents empirical research based on a transaction cost approach, which has often been used to support the idea of information and communication technology and its positive impact on the economic system. The data were collected by mixed methodology, combining quantitative data from 148 users with eight semi-structured interviews. Partial least squares was used to analyse the quantitative data. According to the research findings, information-related problems represent only some of the elements contributing to the transaction costs; these costs also emerge from institutional factors. Studies of the consequences of ICT design and implementation based on transaction cost theory should therefore consider the costs of ICTs.
160

In silico tools in risk assessment : of industrial chemicals in general and non-dioxin-like PCBs in particular

Stenberg, Mia January 2012 (has links)
Industrial chemicals produced in or imported into the European Union in volumes above 1 tonne annually must be registered under REACH. A common problem with these chemicals is deficient information and lack of data for assessing the hazards they pose to human health and the environment. Animal studies providing the type of toxicological information needed are both expensive and time-consuming, and raise ethical concerns; alternative methods to animal testing are therefore requested. REACH has called for an increased use of in silico tools for non-testing data, such as structure-activity relationships (SARs), quantitative structure-activity relationships (QSARs), and read-across. The main objective of the studies underlying this thesis is to explore and refine the use of in silico tools in the risk assessment of industrial chemicals: in particular, to relate properties of the molecular structure to the toxic effect of the chemical substance, using principles and methods of computational chemistry. The initial study was a survey of all industrial chemicals, from which the Industrial Chemical Map was created; a part of this map containing chemicals of potential concern was identified. Secondly, the environmental pollutants polychlorinated biphenyls (PCBs) were examined, in particular the non-dioxin-like PCBs (NDL-PCBs). A set of 20 NDL-PCBs was selected to represent the 178 PCB congeners with three to seven chlorine substituents. The selection procedure combined statistical molecular design, for a representative selection, with expert judgement, so that congeners of specific interest could be included. The 20 selected congeners were tested in vitro in 17 different assays. The data from the screening process was turned into interpretable toxicity profiles with multivariate methods, used to investigate potential classes of NDL-PCBs.
It was shown that NDL-PCBs cannot be treated as one group of substances with similar mechanisms of action. Two groups of congeners were identified. One group, comprising in general lower-chlorinated congeners with a higher degree of ortho substitution, showed higher potency in more assays (including all neurotoxic assays). A second group included abundant congeners with a similar toxic profile that might contribute to a common toxic burden. To investigate the structure-activity pattern of the PCBs' effect on DAT in rat striatal synaptosomes, ten additional congeners were selected and tested in vitro. NDL-PCBs were shown to be potent inhibitors of DAT binding. The congeners with the highest DAT-inhibiting potency were tetra- and penta-chlorinated with 2-3 chlorine atoms in the ortho position. The model was not able to distinguish the congeners with activities in the lower μM range, which could be explained by a relatively unspecific response for the lower ortho-chlorinated PCBs. / The European chemicals regulation REACH stipulates that chemicals produced or imported in quantities above 1 tonne per year must be registered and risk-assessed; an estimated 30,000 chemicals are affected. The problem, however, is that data and information are often insufficient for a risk assessment. Effect data have largely come from animal testing, but animal testing is both costly and time-consuming, and raises ethical concerns. REACH has therefore called for an investigation of whether in silico tools can contribute the requested data and information. In silico roughly means "in the computer" and refers to computational models and methods used to obtain information about the properties and toxicity of chemicals. The aim of this thesis is to explore and refine the use of in silico tools to generate information for the risk assessment of industrial chemicals.

The thesis describes quantitative models, developed with chemometric methods, for predicting the toxic effects of specific chemicals. In the first study (I), 56,072 organic industrial chemicals were examined. Multivariate methods were used to create a map of the industrial chemicals describing their chemical and physical properties. The map was used for comparisons with known and potential environmentally hazardous chemicals. The best-known environmental pollutants turned out to have similar principal properties and clustered on the map; by studying that part of the map more closely, further potentially hazardous substances could be identified. Studies two to four (II-IV) focused on the environmental pollutant PCB. Twenty PCBs were selected to represent, structurally and physicochemically, the 178 PCB congeners with three to seven chlorine substituents. The toxicological effects of these 20 PCBs were examined in 17 different in vitro assays. The toxicological profiles of the 20 tested congeners were established, i.e. which congeners have similar adverse effects and which differ, and the profiles were used to classify the PCBs. Quantitative models were developed to predict the effects of congeners not yet tested and to gain further knowledge of the structural properties that produce undesirable effects in humans and the environment: information that can be used in a future risk assessment of non-dioxin-like PCBs. The final study (IV) is a structure-activity study of the inhibitory effect of the non-dioxin-like PCBs on the neurotransmitter dopamine in the brain.
