51 |
Intégration des méthodes de sensibilité d'ordre élevé dans un processus de conception optimale des turbomachines : développement de méta-modèles / Integration of high-order sensitivity methods into an optimal turbomachinery design process: development of meta-models. Zhang, Zebin, 15 December 2014.
Optimal turbomachinery design usually relies on iterative methods with either experimental or numerical evaluations, which can become very costly because of the many manipulations or the intensive CPU usage involved. To limit these costs and shorten development time, this work proposes to integrate a parameterization method and meta-modeling into the design cycle of a low-speed axial turbomachine. The parameterization, carried out through a high-order sensitivity study of the Navier-Stokes equations, is used to build a parameterized database containing not only the evaluation results but also the simple and cross derivatives of the objectives with respect to the parameters. The richer information carried by the derivatives is put to good use when building the meta-models, in particular with a Co-Kriging method employed to couple several databases. Compared with a classical, derivative-free method, the economic benefit lies in the use of fewer evaluation points. When this number becomes very small, a single reference value may be all that is available along one or more dimensions, which calls for a hypothesis on the error distribution; along those dimensions, Co-Kriging behaves like a Taylor extrapolation from the reference point and its derivatives. This approach was tested by building a meta-model for a fan with a conical hub. The methodology couples databases obtained from two fan geometries and two operating points. The accuracy of the response surface made it possible to run an optimization with the NSGA-II genetic algorithm; of the two selected optima, one maximizes efficiency and the other widens the operating range. The optimization results are finally validated by additional numerical simulations.
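To make the key mechanism concrete: along a dimension where only one reference evaluation exists, the derivative-enhanced surrogate reduces to a Taylor expansion around that point. The sketch below illustrates this degenerate case; the reference values, derivatives, and the two-parameter setup are invented for illustration and are not the thesis's actual data or code.

```python
import numpy as np

def taylor_surrogate(x0, f0, grad, hess):
    """Second-order Taylor expansion around a reference point x0.

    Mimics how a derivative-enhanced Co-Kriging model behaves along a
    dimension where a single evaluation (plus its simple and cross
    derivatives) is all that is available.
    """
    x0, grad, hess = np.asarray(x0), np.asarray(grad), np.asarray(hess)
    def predict(x):
        dx = np.asarray(x) - x0
        return f0 + grad @ dx + 0.5 * dx @ hess @ dx
    return predict

# Illustrative two-parameter "efficiency" response around one CFD evaluation.
x0 = [1.0, 0.5]
f0 = 0.82                              # objective evaluated at x0
grad = [0.10, -0.05]                   # simple derivatives df/dx_i
hess = [[-0.40, 0.02],                 # cross derivatives d2f/dx_i dx_j
        [0.02, -0.30]]

surrogate = taylor_surrogate(x0, f0, grad, hess)
print(surrogate([1.2, 0.45]))          # cheap prediction, no new CFD run
```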
|
52 |
An approach to automate the adaptor software generation for tool integration in Application/Product Lifecycle Management tool chains. Singh, Shikhar, January 2016.
An emerging problem in organisations is that a large number of tools store data and need to communicate with each other frequently throughout the development of an application or product, yet no means of communication exists that does not involve a central entity (usually a server) or a schema stored in a central repository. Accessing data across tools and linking it is difficult and resource-intensive. As part of this thesis, we develop a piece of software (referred to as an 'adaptor'), which, when implemented in lifecycle management systems, integrates their data seamlessly. This eliminates the need to store database schemas in a central repository and makes accessing data across tools less resource-intensive. The adaptor acts as a wrapper around each tool and allows the tools to communicate directly and exchange data. When the adaptor is used, data in relational databases is first converted into RDF and then sent or received; RDF is thus the crucial underlying concept on which the software is based. The Resource Description Framework (RDF) enables data integration irrespective of the underlying schemas by treating data as resources and representing them as URIs; it is a data model used for the exchange of data on the Internet and can be applied to other real-world problems such as tool integration and the automation of communication between relational databases. However, developing such an adaptor for every tool requires understanding the individual schema and structure of each tool's database, which demands considerable effort from the adaptor's developer. The main aim of the thesis is therefore to automate the development of these adaptors, eliminating the need for anyone to manually assess a database and then hand-craft an adaptor specific to it. Such adaptors and concepts can be reused in other organisations facing similar problems. In the end, the output of the thesis is an approach which automates the process of generating these adaptors.
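As a rough sketch of the adaptor's core step — lifting relational rows into RDF resources identified by URIs — the following uses the rdflib library; the table name, columns, and namespace are hypothetical, and the thesis's generated adaptors are of course tool-specific.

```python
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF

EX = Namespace("http://example.org/plm/")      # hypothetical namespace

def rows_to_rdf(table, rows, key):
    """Convert relational rows (dicts) into an RDF graph."""
    g = Graph()
    g.bind("ex", EX)
    for row in rows:
        subject = URIRef(EX[f"{table}/{row[key]}"])   # data as a resource
        g.add((subject, RDF.type, EX[table]))
        for column, value in row.items():
            if column != key and value is not None:
                g.add((subject, EX[column], Literal(value)))
    return g

# Example: two rows from a hypothetical 'Requirement' table.
rows = [{"id": 1, "title": "Max speed", "status": "approved"},
        {"id": 2, "title": "Braking distance", "status": "draft"}]
print(rows_to_rdf("Requirement", rows, "id").serialize(format="turtle"))
```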
|
53 |
系統功能演化之需求分析方法論 / Requirements Analysis Methodology for System Functional Evolution. 劉季綸 (Liu, Chi-Lun), date unknown.
In an era that values collective intelligence and service quality and demands rapid adaptation to a changing environment, traditional systems development methodologies remain valuable but show their limits. New methodologies must guide enterprises to listen to their customers so as to ensure high-quality system services, and must guide developers to collaborate with other departments and with the public in new ways, maintaining systems continuously so that they keep evolving.
This research proposes a user-requirements analysis methodology grounded in philosophical hermeneutics and the software-engineering literature. The methodology performs a preliminary analysis and confirmation of user-proposed requirements and comprises two main parts: a requirements-formation process and a conflict-resolution process. Requirements formation covers proposing a new requirement, analyzing its commercial and technical impact, estimating costs and benefits to decide whether to implement it, prioritizing implementation, and scheduling the release of the new function. Conflict resolution covers negotiation between the parties, mediation by a third party, and arbitration by a senior decision group.
Besides the methodology, this research develops a blog-based tool for collecting user requirements on the Internet, letting users post their own ideas for new functions. To help an enterprise judge whether a new requirement triggers the conflict-resolution process, this research extends and modifies use-case models on a philosophical-hermeneutics foundation to propose a meta-model, supplemented with ontologies, and defines a set of conflict-detection rules. Based on the meta-model, the ontologies, and the rules, the tool can automatically detect conflicts between new requirements and the existing system design; the rules can also be applied to conflicts among the new requirements themselves.
An online newspaper company and an online shopping mall trialled the methodology and the blog-based tool. In interviews they confirmed the value of both and offered several suggestions, based on which this research proposes concrete improvements and future directions.
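The abstract does not spell the conflict-detection rules out; the toy sketch below shows the general shape such a rule could take over a simplified requirement representation. The attributes and the "mutually exclusive actions" rule are assumptions made for illustration.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Requirement:
    actor: str     # who uses the function
    action: str    # what the system should do
    target: str    # what the action applies to

# Hypothetical rule: requirements conflict when the same actor asks for
# mutually exclusive actions on the same target.
EXCLUSIVE = {frozenset(p) for p in [("enable", "disable"), ("show", "hide")]}

def conflicts(a: Requirement, b: Requirement) -> bool:
    return ((a.actor, a.target) == (b.actor, b.target)
            and frozenset((a.action, b.action)) in EXCLUSIVE)

design = [Requirement("member", "show", "purchase history")]
proposal = Requirement("member", "hide", "purchase history")
print(any(conflicts(proposal, r) for r in design))  # True -> conflict resolution
```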
|
54 |
Gestion des incertitudes dans le processus de développement de systèmes complexes / Uncertainty management in complex systems development processes. Xiao, Jing, 4 December 2009.
Uncertainty management is a particularly important issue for complex systems development (SD) processes. The overall objective is to improve the flexibility and predictability of SD processes by proposing a methodology for understanding and mastering the uncertainties they contain, and modeling those uncertainties is an indispensable step in that direction. The work presented in this thesis therefore focuses on two critical questions: (1) how should uncertainties be managed during an SD process? (2) how can uncertainties be represented when modeling such a process? In answer, the thesis proposes a methodology covering the identification of uncertainties, the analysis of their effects, their reduction and/or exploitation, and their monitoring and control. It also defines an approach to modeling uncertainties with UML, proposing a meta-model of uncertainties and a new UML profile named "Uncertainty-of" that makes the meta-model usable with standard UML tools. The validity of the methodology and the profile was tested on an industrial case study, which showed that such a methodology helps organize uncertainty management in systems design and development, and that the new UML profile provides a sound solution for representing uncertainties.
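To give the four-step cycle a concrete shape, here is a minimal sketch of how each uncertainty could be tracked through identification, impact analysis, reduction/exploitation, and monitoring; the class, field names, and example values are assumptions, not the thesis's meta-model or profile.

```python
from dataclasses import dataclass, field

@dataclass
class Uncertainty:
    name: str
    source: str                    # e.g. "requirements", "technology"
    impact: float = 0.0            # analyzed effect on the SD process, 0..1
    treatment: str = "none"        # "reduce", "exploit", or "accept"
    history: list = field(default_factory=list)

def manage(u: Uncertainty, impact: float, treatment: str) -> None:
    """One pass of the cycle: analyze effects, treat, record for monitoring."""
    u.impact = impact
    u.treatment = treatment
    u.history.append((impact, treatment))   # basis for monitoring and control

u = Uncertainty("battery energy density", source="technology")  # identification
manage(u, impact=0.7, treatment="reduce")
print(u)
```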
|
55 |
Équation d'état de la matière à densité supranucléaire et application à l'émission thermique des étoiles compactes / Equation of state of matter at supra-nuclear density and application to the thermal emission of neutron stars. Baillot d'Étivaux, Nicolas, 4 October 2018.
This thesis concerns the theoretical modeling of the equation of state (EoS) describing the nuclear matter in the core of neutron stars (NS), under the hypothesis that no phase transition occurs there. We use a meta-model able to (i) directly incorporate nuclear-physics knowledge in the form of empirical parameters such as the nuclear saturation density, the incompressibility, and the symmetry energy; (ii) reproduce most existing nucleonic models; and (iii) explore unknown high-density behavior in a very flexible way. For each EoS we determine a set of solutions for the masses and radii of NS and make a first selection of the EoS compatible with the stability and causality of nuclear matter, as well as with the maximum observed NS mass. We then confront these EoS with observations of thermal emission in the soft X-ray band for seven carefully chosen NS. For the first time, the theoretical modeling of the EoS is implemented directly in the data analysis. We use the recent GAIA II measurements to fix the distances to the NS. The parameters of the thermal-emission model and of the EoS are determined by a Bayesian method based on a Markov Chain Monte Carlo algorithm. We thereby determine the surface effective temperature, mass, and radius of each NS, as well as the values of some empirical nuclear parameters such as the density dependence of the symmetry energy (Lsym), the isovector incompressibility (Ksym), and the isoscalar skewness (Qsat).
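The flavor of the Bayesian analysis can be conveyed with a skeletal Metropolis-Hastings loop over three of the empirical parameters; the Gaussian placeholder likelihood stands in for the real comparison of predicted X-ray spectra with the data, and all numbers are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_likelihood(theta):
    """Placeholder for the fit of predicted thermal spectra to the 7 NS."""
    lsym, ksym, qsat = theta
    return -0.5 * (((lsym - 50.0) / 10.0) ** 2
                   + ((ksym + 100.0) / 100.0) ** 2
                   + ((qsat - 300.0) / 400.0) ** 2)

def metropolis(theta0, steps=10_000, scale=(2.0, 20.0, 80.0)):
    theta = np.array(theta0, dtype=float)
    logl = log_likelihood(theta)
    chain = []
    for _ in range(steps):
        proposal = theta + rng.normal(0.0, scale)      # symmetric random walk
        logl_prop = log_likelihood(proposal)
        if np.log(rng.uniform()) < logl_prop - logl:   # Metropolis acceptance
            theta, logl = proposal, logl_prop
        chain.append(theta.copy())
    return np.array(chain)

chain = metropolis([40.0, -50.0, 200.0])
print(chain.mean(axis=0))    # posterior means of (Lsym, Ksym, Qsat)
```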
|
56 |
Assuring Quality in Web-Based Information Systems : A quality-driven design model. Theodorsson, Mike; Rydiander, Ida, January 2007.
Quality has always been an elusive concept in information system design, all the more so when dealing with the online sphere. The purpose of this thesis is to investigate and dissect the concept of quality and present a practical way to apply it to the design process. To do this, a quality-driven design model has been developed, focused on four crucial aspects of overall quality: accessibility, usability, navigation and interactivity. By utilizing simple and practical techniques and measuring their success in achieving quality, this study attempts to prove that quality can be harnessed as a hands-on resource and guaranteed in any design project involving a Web-Based Information System, merely by applying this model.
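As a hands-on caricature of "measuring success in achieving quality", a design review could aggregate ratings over the four aspects like this; only the four aspects come from the thesis, while the weights and scores are invented.

```python
# Weighted score over the four quality aspects of the design model.
WEIGHTS = {"accessibility": 0.25, "usability": 0.30,
           "navigation": 0.25, "interactivity": 0.20}

def quality_score(ratings: dict) -> float:
    """ratings: aspect -> reviewer rating in [0, 1]."""
    return sum(WEIGHTS[aspect] * ratings[aspect] for aspect in WEIGHTS)

review = {"accessibility": 0.9, "usability": 0.7,
          "navigation": 0.8, "interactivity": 0.6}
print(f"overall quality: {quality_score(review):.2f}")
```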
|
57 |
An Application Framework for Monitoring Care Processes. Baarah, Aladdin, 17 December 2013.
Care process monitoring is important in healthcare domains to provide precise and detailed analytics on patients, providers, and resources participating in a care process and their status. These analytics are used to keep track of whether the quality of care goals set by healthcare organizations are satisfied and ensure that legislative and organizational guidelines are followed. The complexity of care process monitoring can vary depending on whether the care process takes place in a hospital or out in the community, and it can vary depending on the complexity of the information technology infrastructure that is in place to support the care process.
A Care Process Monitoring Application (CPMA) is a software application which collects and integrates data from various sources while a care process is being provided, in order to provide performance reporting of metrics that are used to measure how well the performance goals and guidelines for the care process are being met. In our research, we have studied how CPMAs are built in order to improve the quality of their engineering. The significant challenge in this context is how to engineer a CPMA so that the engineering process is repeatable, produces a CPMA of consistent high quality, and requires less time, less effort and less complexity.
This thesis proposes an application framework for care process monitoring that collects and integrates events from event sources, maintains the individual and aggregate states of the care process, and populates a metrics data mart to support performance reporting. Our contributions are the following: a state-based application meta-model of care process monitoring, a care process monitoring architectural pattern, and finally, a behavior-driven development methodology for CPMAs based on our meta-model and architectural pattern. A compressed sketch of this pipeline appears after the validation paragraph below.
Our results are validated through three different case studies in which we collaborated with two different health care organizations to build and deploy CPMAs for two different care processes (one hospital-based, the other community-based) in collaboration with healthcare clinicians and researchers.
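Here is that sketch — consume care-process events, maintain each patient's individual state, and roll the states up into aggregates for the metrics data mart; the event names and states are invented, not taken from the case studies.

```python
from collections import Counter

# Hypothetical care-process state machine: event -> resulting state.
TRANSITIONS = {"referral_received": "referred",
               "triage_done": "triaged",
               "treatment_started": "in_treatment",
               "discharged": "done"}

patient_state = {}        # individual state, keyed by patient id
aggregate = Counter()     # aggregate state across the care process

def on_event(patient_id, event):
    new_state = TRANSITIONS[event]
    old_state = patient_state.get(patient_id)
    if old_state is not None:
        aggregate[old_state] -= 1
    patient_state[patient_id] = new_state
    aggregate[new_state] += 1     # these counts feed the metrics data mart

for pid, ev in [("p1", "referral_received"), ("p2", "referral_received"),
                ("p1", "triage_done")]:
    on_event(pid, ev)
print(dict(aggregate))    # {'referred': 1, 'triaged': 1}
```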
|
58 |
Um framework baseado em modelos para desenvolvimento de sistemas multimídia distribuídos autoadaptativos / A model-based framework for developing self-adaptive distributed multimedia systems. Pinto, Felipe Alves Pereira, 19 January 2011.
Distributed multimedia systems have highly variable characteristics, giving rise to new requirements as new technologies become available or as the system must adapt to the amount of resources at hand. Such systems should therefore support dynamic adaptation, adjusting their structure and behavior at runtime. This work presents a model-based adaptation approach and proposes a reflective, component-based framework for building and supporting self-adaptive distributed multimedia systems, providing many facilities for their development and evolution, such as dynamic adaptation. The proposal is to maintain one or more models representing the system at runtime, so that an external entity can analyze these models, identify problems, and try to resolve them. These models make up the reflective meta-level, acting as a self-representation of the system. The framework defines a meta-model for describing self-adaptive distributed multimedia applications, able to represent components and their relationships, policies for QoS specification, and adaptation actions. Additionally, an ADL and an architecture for model-based adaptation are proposed. As a case study, several scenarios demonstrate the use of the framework in practice, with and without the ADL, and verify some characteristics related to dynamic adaptation.
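The control loop implied by this design — a runtime model acting as the system's self-representation, inspected by an external entity that adapts the system when a QoS policy is violated — can be sketched as follows; the component, the frame-rate metric, and the codec-switch action are all hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Component:
    name: str
    codec: str
    measured_fps: float          # runtime observation reflected into the model

@dataclass
class QoSPolicy:
    min_fps: float

def analyze_and_adapt(model, policy):
    """External entity: inspect the self-representation, then adapt it."""
    for comp in model:
        if comp.measured_fps < policy.min_fps:        # policy violation
            comp.codec = "h264_low_bitrate"           # hypothetical adaptation
            print(f"adapted {comp.name}: codec -> {comp.codec}")

runtime_model = [Component("video_stream", "h264_hd", measured_fps=14.0)]
analyze_and_adapt(runtime_model, QoSPolicy(min_fps=24.0))
```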
|