21 |
Digital affärsmodellsutveckling för fysiska butiker / Digital Business Model Development for Physical Stores
Blom, Kristian, Larstorp, Oliver January 2021 (has links)
In recent years, physical stores have increasingly ended up in the shadow of large digital commerce platforms such as CDON and Amazon, and empty shop premises are an ever more common sight across Sweden. As a counter-reaction, the business association in Torsby municipality has started a collaboration in which physical stores in the municipality are offered the opportunity to join a local digital commerce platform. One of the reasons is to make Torsby's stores more accessible and thereby capture customers who wish to shop locally in a digital and convenient way. The purpose of the study is to describe how a physical store's business model can be developed to meet customers' increasingly digital buying behaviour. The project described above goes by the name handla.itorsby.se and is the hub of the entire study. A relevant question is, of course, whether this digital commerce platform can keep the stores in Torsby from fading away. There are countless ways to approach a question like this, but we have chosen to look more closely at the stores' business models. Working continuously with business model development is important but often neglected by companies (Chesbrough, 2007). One reason may be that deliberate business model development is a relatively unexplored area (Fjeldstad & Snow, 2018) and that the approaches to this process are abstract and hard to grasp. The study finds that the digital commerce platform has the potential to develop and change the connected stores' business models and thus counteract their erosion. At the time of writing, however, the road there is long, for two reasons. First, the degree of digitalisation required for a store to be a functioning part of the platform is, for most, far from reality. Second, there is some ambiguity about what the purpose of the platform really is. The business association holds that the project is about educating and digitalising the stores in Torsby. Customers see the platform as any other digital marketplace and thus expect it to work as one. The stores oscillate between these two extremes: some see the project as a chance to educate and digitalise their business, others as just another sales channel. As mentioned, the platform has the potential to develop and change the stores' business models, but for this to become reality the purpose must be better anchored within the stores themselves. Many are simply not ready for such a change. / During recent times physical stores have found it hard to survive, which is made noticeable by all the empty shop premises around the country. The reasons are many, but the increasing competition from internet commerce is one of the more severe ones. Many of our respondents experience this competition as the pressure from digital platforms such as Amazon becomes more palpable. Digital business model development for physical stores has been neglected in business model research. Therefore, the purpose of this study is to describe how a physical store's business model can develop to meet the customers' increasingly digital buying behavior. In this study, we have looked closer at the digital business platform handla.itorsby.se, trying to understand whether this digital tool can help physical stores regain some competitiveness. We look at this problem from a business model perspective. Working continuously with business model development is significant but often disregarded.
It is crucial since business models lose their relevance over time. The study concludes that, at present, the platform is not particularly helpful in developing and changing the physical stores' business models, for two main reasons. The first is the degree to which the organizations are digitally up to date: many of the companies are too far behind digitally to be an active part of the platform, and the change needed in their business model is far too radical. These companies' business models have been static for a long time and focused on developing and changing the internal operational processes of the business; the dynamics of the business model have not changed or developed at all, and their business models are therefore considered outdated. The second reason is ambiguity regarding the role of the business platform. From the project management perspective, the goal is to educate and digitalize the stores in Torsby. From a customer perspective, the platform is a commerce platform, so customers want to be able to see the offers and buy products. The company perspective is somewhere in between: some want to be educated and helped to digitalize their business, while others see it more as a tool for selling more products and services. The platform has the potential to help companies develop and change their business models, but for that to happen the idea must be more firmly established within the companies. Many of the stores are not ready for such a radical change.
|
22 |
Inhibition Kinetics of Hydrogenation of Phenanthrene / Inhiberingskinetik för hydrering av fenantren
Johansson, Johannes January 2019 (has links)
In this thesis work, the hydrogenation kinetics of phenanthrene inhibited by the basic nitrogen compound acridine and the non-basic carbazole were investigated. Based on a transient reactor model, a steady-state plug flow model was developed, and kinetic parameters were estimated through nonlinear regression against experimental data. The experimental data had previously been collected from hydrotreating of phenanthrene in a bench-scale reactor packed with a commercial NiMo catalyst mixed with SiC. As a first, two-step solution, the yields of the hydrogenation products of phenanthrene were predicted as a function of conversion, which was subsequently used to calculate concentration profiles as a function of position in the reactor. As a second, improved solution, the concentration profiles were calculated directly as a function of residence time, and these results were then used for further analysis. Reaction network 2 in figure 7 was considered sufficient to describe the product distribution of phenanthrene, with a pseudo-first-order rate law for the nitrogen compounds. Both solution methods provided similar results that gave good predictions of the experimental data, with a few exceptions; these cases could be improved by gathering more experimental data or by investigating the effect of some model assumptions. The two-step method thus proved useful for evaluating the phenanthrene reaction network and providing an initial estimate of the parameters, while the one-step method could then give a more precise solution by estimating all parameters simultaneously. As expected, acridine was shown to be more inhibiting than carbazole, both in the calculated concentration profiles and in the estimated parameters. A possible saturation effect was also seen in the inhibition behavior, where adding more nitrogen compound had only a small additional effect on the phenanthrene conversion. The Mears and Weisz-Prater criteria were found to be inversely proportional to the concentrations of the nitrogen compounds and to otherwise depend only on rate constants, with values well below the limits for diffusion-controlled processes. Sensitivity analyses also supported that the global minimum had been found in the nonlinear regression solution.
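The parameter estimation described above (a steady-state plug flow model fitted to measured concentration profiles by nonlinear regression) can be pictured with a minimal sketch. The rate expression, parameter names, and synthetic data below are assumptions for illustration only; they are not the thesis's actual reaction network, inhibition term, or experimental data.

```python
# Minimal sketch: fit a plug-flow hydrogenation model with nitrogen inhibition.
# Rate law, parameter names and data are illustrative assumptions only.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import curve_fit

def phen_profile(tau, k, K_N, C_N=0.001):
    """Normalised phenanthrene concentration vs. residence time tau.

    Assumed rate law: dC/dtau = -k * C / (1 + K_N * C_N)  (pseudo-first-order,
    inhibited by a nitrogen compound at fixed concentration C_N).
    """
    def rhs(t, C):
        return -k * C / (1.0 + K_N * C_N)
    sol = solve_ivp(rhs, (0.0, tau.max()), [1.0], t_eval=tau)  # C0 normalised to 1
    return sol.y[0]

# Synthetic "experimental" data, standing in for the bench-scale measurements.
tau_exp = np.linspace(0.1, 5.0, 12)
C_exp = phen_profile(tau_exp, k=1.2, K_N=800.0) * (1 + 0.02 * np.random.randn(12))

# Nonlinear regression of the two kinetic parameters.
popt, pcov = curve_fit(phen_profile, tau_exp, C_exp, p0=[1.0, 500.0])
print("estimated k = %.3f, K_N = %.1f" % tuple(popt))
print("approximate standard errors:", np.sqrt(np.diag(pcov)))
```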
|
23 |
Improving Metacomprehension And Learning Through Graduated Concept Model Development
Kring, Eleni 01 January 2004 (has links)
Mental model development, deeper levels of information processing, and elaboration are critical to learning. Moreover, individuals' metacomprehension accuracy is integral to making improvements to their knowledge base. In other words, without an accurate perception of their knowledge on a topic, learners may not know that knowledge gaps or misperceptions exist and, thus, would be less likely to correct them. Therefore, this study offered a dual-process approach aimed at enhancing metacomprehension. One path aimed at advancing knowledge structure development and, thus, mental model development. The other focused on promoting a deeper level of information processing through processes like elaboration. It was predicted that this iterative approach would culminate in improved metacomprehension and increased learning. Accordingly, using the Graduated Concept Model Development (GCMD) approach, the role of learner-generated concept model development in facilitating metacomprehension and knowledge acquisition was examined. Concept maps have had many roles in the learning process as mental model assessment tools and advanced organizers. However, this study examined the process of concept model building as an effective training tool. While concept maps functioning as advanced organizers are certainly beneficial, having a learner examine and amend the current state of their knowledge through concept model development would likely prove more effective for learning. In other words, learners looking at an advanced organizer of the training material may feel assured that they have a thorough understanding of it. Only when they are forced to create a representation of the material would the gaps and misperceptions in their knowledge base likely be revealed. In short, advanced organizers seem to rely on recognition, whereas concept model development likely requires recalling and understanding 'how' and 'why' the interrelationships between concepts exist. Therefore, the Graduated Concept Model Development (GCMD) technique offered in this study was based on the theory that knowledge acquisition improves when learners integrate new information into existing knowledge, assign elaborated meanings to concepts, correct misperceptions, close knowledge gaps, and strengthen accurate connections between concepts by posing targeted questions against their existing knowledge structures. This study placed an emphasis on meaningful learning and suggested a process by which newly introduced concepts would be manipulated for the purpose of improving metacomprehension by strengthening accurate knowledge structures and mental model development, and through deeper and elaborated information processing. Indeed, central to improving knowledge deficiencies and misunderstandings is metacomprehension, and the construction of concept maps was hypothesized to improve metacomprehension accuracy and, thus, learning. This study was a one-factor between-groups design with concept map type as the independent variable, manipulated at four levels: no concept map, concept map as advanced organizer, learner-built concept map with feedback, and learner-built concept map without feedback. The dependent variables included performance (percent correct) on a declarative and integrative knowledge assessment, mental model development, and metacomprehension accuracy. Participants were 68 (34 female, 34 male, ages 18-35, mean age = 21.43) undergraduate students from a major southeastern university.
Upon arrival, participants were randomly assigned to one of the four experimental conditions, and analysis revealed no significant differences between the groups. Participants then progressed through the three stages of the experiment. In Stage I, participants completed forms regarding informed consent, general biographical information, and task self-efficacy. In Stage II, participants completed the self-paced tutorial based on the Distributed Dynamic Decision Making (DDD) model, a simulated military command and control environment aimed at creating events to encourage team coordination and performance (for a detailed description, see Kleinman & Serfaty, 1989). The manner in which participants worked through the tutorial was determined by their assigned concept map condition. Upon finishing each module of the tutorial, participants completed a metacomprehension prediction question. In Stage III, participants completed the computer-based knowledge assessment test, covering both declarative and integrative knowledge, followed by the metacomprehension postdiction question. Participants then completed the card sort task as the assessment of mental model development. Finally, participants completed a general study survey and were debriefed as to the purpose of the study. The entire experiment lasted approximately 2 to 3 hours. Results indicated that the GCMD condition showed stronger metacomprehension accuracy, via prediction measures, compared with the other three conditions (control, advanced organizer, and feedback), and, specifically, significantly higher correlations than the other three conditions for declarative knowledge. Self-efficacy measures also indicated that the higher metacomprehension accuracy correlation observed in the GCMD condition was likely the result of the intervention, and not due to differences in self-efficacy in that group of participants. Likewise, the feedback and GCMD conditions showed significant metacomprehension accuracy correlations based on levels of understanding of the declarative knowledge tutorial module (Module 1). The feedback condition also showed similar responses for the integrative knowledge module (Module 2). The advanced organizer, feedback, and GCMD conditions also showed significant correlations between self-reported postdictions of performance on the knowledge assessment and the actual knowledge assessment results. However, results also indicated no significant differences between the four conditions in the mental model assessment or the knowledge assessment. Nevertheless, the results support the relevance of accurate mental model development to knowledge assessment outcomes. Retrospectively, two opposing factors may have complicated efforts to detect additional differences between groups. On one side, the experimental measures may not have been rigorous enough to isolate the effect of the intervention itself. Conversely, software usability issues and the resulting limitations in experimental design may have worked against the two concept mapping conditions and inadvertently suppressed effects of the intervention. Future research on the GCMD approach will likely examine cognitive workload, concept mapping software design, and the sensitivity of the measures involved.
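Metacomprehension accuracy in studies like this one is quantified as the correlation between participants' predicted (or postdicted) performance and their actual knowledge-test scores, computed within each condition. The sketch below illustrates that calculation; the column names and toy data are assumptions, not the study's dataset.

```python
# Sketch: per-condition metacomprehension accuracy as a Pearson correlation
# between self-predicted understanding and actual knowledge-test performance.
# Column names and data are illustrative assumptions, not the study's dataset.
import pandas as pd
from scipy.stats import pearsonr

df = pd.DataFrame({
    "condition":  ["control"] * 4 + ["gcmd"] * 4,
    "prediction": [70, 55, 85, 60, 80, 60, 90, 70],   # predicted score (%)
    "score":      [62, 58, 70, 66, 78, 63, 88, 72],   # actual percent correct
})

for cond, grp in df.groupby("condition"):
    r, p = pearsonr(grp["prediction"], grp["score"])
    print(f"{cond}: metacomprehension accuracy r = {r:.2f} (p = {p:.2f})")
```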
|
24 |
Compact Modeling of Silicon Carbide (SiC) Vertical Junction Field Effect Transistor (VJFET) in PSpice using Angelov Model and PSpice Simulation of Analog Circuit Building Blocks using SiC VJFET Model
Purohit, Siddharth 09 December 2006 (has links)
This thesis presents the development of a compact model of a novel silicon carbide (SiC) Vertical Junction Field Effect Transistor (VJFET) for high-power circuit simulation. An empirical Angelov model is developed for the SiC VJFET in PSpice. The model is capable of accurately replicating the device behavior under DC and transient conditions. The model was validated against measured data obtained from devices developed by the Mississippi Center for Advanced Semiconductor Prototyping at MSU and SemiSouth Laboratories. The modeling approach is based on extracting Angelov equation coefficients from experimental device characteristics using nonlinear fitting. The coefficients are extracted for different parameters (temperature, width, etc.), and a multi-dimensional interpolation technique is used to incorporate the effect of more than one parameter. The models developed in this research are expected to be valuable tools for electronic designers in the future. The developed model was applied to investigate the characteristics of a few standard analog circuit blocks using the SiC VJFET and a Si JFET, in order to demonstrate the capability of the model to reveal the relative advantages of one over the other. The selected circuits of interest were a voltage follower, a common source amplifier, a current source, and a differential amplifier. Simulations of analog circuit building blocks incorporating the SiC VJFET showed better circuit functionality compared to their Si counterparts.
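The coefficient-extraction step can be pictured with a small sketch: fit a commonly published form of the Angelov drain-current equation to measured I-V points by nonlinear regression. The equation variant, coefficient names, and data below are assumptions for illustration; the thesis's actual SiC VJFET parameter set and PSpice implementation are not reproduced here.

```python
# Sketch: fit a standard form of the Angelov (Chalmers) drain-current equation
# to I-V points. Equation variant, coefficients and data are assumed for
# illustration; they are not the thesis's extracted SiC VJFET parameters.
import numpy as np
from scipy.optimize import curve_fit

def angelov_ids(v, Ipk, Vpk, P1, alpha, lam):
    """Ids = Ipk * (1 + tanh(psi)) * (1 + lam*Vds) * tanh(alpha*Vds),
    with psi truncated to its first-order term P1*(Vgs - Vpk)."""
    vgs, vds = v
    psi = P1 * (vgs - Vpk)
    return Ipk * (1.0 + np.tanh(psi)) * (1.0 + lam * vds) * np.tanh(alpha * vds)

# Toy measurement grid, standing in for experimental DC characteristics.
vgs = np.repeat([-2.0, -1.0, 0.0], 10)
vds = np.tile(np.linspace(0.2, 10.0, 10), 3)
ids_meas = angelov_ids((vgs, vds), Ipk=0.5, Vpk=-1.5, P1=1.2, alpha=0.8, lam=0.01)
ids_meas *= 1 + 0.01 * np.random.randn(ids_meas.size)   # measurement noise

popt, _ = curve_fit(angelov_ids, (vgs, vds), ids_meas,
                    p0=[0.4, -1.0, 1.0, 1.0, 0.0])
print(dict(zip(["Ipk", "Vpk", "P1", "alpha", "lam"], np.round(popt, 3))))
```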
|
25 |
Chemometrics Development using Multivariate Statistics and Vibrational Spectroscopy and its Application to Cancer Diagnosis
Li, Ran January 2015 (has links)
No description available.
|
26 |
Full-Vehicle Model Development of a Hybrid Electric Vehicle and Development of a Controls Testing Framework
Khanna, Arjun 29 December 2016 (has links)
No description available.
|
27 |
The Role of Interstitial Fluid Flow in the Progression of Glioblastoma and Alzheimer's Disease
Tate, Kinsley 30 November 2022 (has links)
The human brain is a complex organ that is responsible for regulating all the physiological processes in the body, ranging from memory to movement. As humans age, the brain goes through a variety of changes including a reduction in glymphatic waste clearance and increase in glial reactivity. Two neurological conditions that affect individuals over the age of 65 include glioblastoma (GBM) and Alzheimer's disease (AD). Interestingly, patients with GBM do not present with AD and vice versa. Both conditions are characterized by a disruption in interstitial fluid flow (IFF) and an increase in neuroinflammation. Throughout the following dissertation, we examined the role of IFF in AD and GBM progression using a three-sided approach (in vivo, in vitro, and in silico). Increased IFF underlies glioma invasion into the surrounding tumor microenvironment (TME) in GBM. We used a 3D hydrogel model of the GBM TME to examine potential pathways by which astrocytes and microglia contribute to glioma invasion. A reduction in IFF contributes to accumulation of the toxic protein amyloid beta (Aβ) in AD. We sought to create a novel, patient-inspired model of the AD hippocampus for examination of the relationship between IFF and Aβ clearance. Human AD and unaffected control hippocampal brain samples were stained for markers of neurons, astrocytes, microglia and Aβ. The percentage of each cell population in the CA1 region of the hippocampus was calculated. We also analyzed the amount and characteristics of the Aβ aggregates present in this hippocampal region. Pearson correlation analysis was completed to assess the relationships between the various cell populations, Aβ load, and patient descriptors. The cell ratios gleaned from the patient samples were incorporated into a novel, 3D hydrogel model of the AD hippocampus. This model features a hydrogel mixture like the native brain extracellular matrix (ECM) and allows for the application of IFF and Aβ. To our knowledge, we are the first group to create a patient-specific triculture model of the AD hippocampus, which is the main site of Aβ aggregation in the AD brain. We used this model to examine the relationship between IFF-mediated Aβ clearance and glial reactivity. The last aim of this dissertation was to create a computational model for examining Aβ binding within the ECM and the effects of IFF on Aβ clearance. In vitro experiments were conducted to generate 3D renderings of glial cells and to determine relevant parameters for our model. Throughout this work, we discuss the relationship between disruption in IFF and glial reactivity in the context of GBM and AD. / Doctor of Philosophy / The human brain is a complex organ that is responsible for regulating all the physiological processes in the body, ranging from memory to movement. As humans age, the brain goes through a variety of changes including a reduction in brain waste removal and an increase in inflammation. Two neurological conditions that affect individuals over the age of 65 include glioblastoma (GBM) and Alzheimer's disease (AD). Interestingly, patients with GBM do not present with AD and vice versa. Both conditions are characterized by a disruption in brain interstitial fluid flow (IFF) and an increase in neuroinflammation. Throughout the following dissertation, we examined the role of IFF in AD and GBM progression using a three-sided approach including analysis of mouse and human tissues, engineered cell models, and computational methods. 
Specific interactions between brain cell types and their relationships with glioma invasion were examined using a 3D cell model that mimics the brain. Through the work presented here, we also sought to create a novel cell model of the hippocampus region located in the AD brain. We quantified the various cell types in the hippocampus of AD patient samples and incorporated this information into our hydrogel model. The resulting model features three brain cell types (astrocytes, microglia, and neurons) that are added at patient relevant ratios, a matrix that mimics the native brain scaffold, and allows for the application of IFF. In the AD brain there is a reduction in brain waste removal that leads to accumulation of the toxic protein amyloid beta (Aβ). We were successfully able to incorporate this protein within our model so we could assess the relationship between IFF and Aβ removal from the brain. We further studied this relationship using a new computational model of Aβ accumulation in the brain. Throughout this work, we discuss the connection between disrupted IFF and neuroinflammation in the context of GBM and AD.
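The computational aim described above, examining Aβ binding and the effect of interstitial fluid flow on clearance, can be illustrated by a minimal transport model: a 1D advection-diffusion equation with a first-order clearance term, solved by explicit finite differences. The geometry, parameter values, and boundary conditions below are assumptions for illustration, not those of the dissertation's model.

```python
# Sketch: 1D advection-diffusion-clearance model of amyloid-beta under
# interstitial fluid flow. Parameters and geometry are illustrative assumptions.
import numpy as np

L, N = 1e-3, 101                 # 1 mm tissue slab, number of grid points
dx = L / (N - 1)
D = 1e-10                        # effective Abeta diffusivity (m^2/s), assumed
v = 1e-6                         # interstitial flow velocity (m/s), assumed
k_clear = 1e-3                   # first-order clearance rate (1/s), assumed
dt = 0.2 * dx**2 / D             # diffusion-limited explicit time step
if v * dt / dx > 1.0:            # keep the advection step stable too (CFL)
    dt = 0.5 * dx / v

c = np.zeros(N)
c[0] = 1.0                       # constant Abeta source at the inlet boundary

for _ in range(20000):
    diff = D * (np.roll(c, -1) - 2 * c + np.roll(c, 1)) / dx**2
    adv = -v * (c - np.roll(c, 1)) / dx            # upwind advection
    c[1:-1] += dt * (diff[1:-1] + adv[1:-1] - k_clear * c[1:-1])
    c[0], c[-1] = 1.0, c[-2]                        # fixed inlet, zero-gradient outlet

print("approx. steady-state Abeta at mid-slab: %.3f (relative to source)" % c[N // 2])
```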
|
28 |
Evaluating the educational effectiveness of simulation games: A value generation model
Ranchhod, A., Gurau, C., Loukis, E., Trivedi, Rohit 09 November 2013 (has links)
No / This article investigates the relationships between various types of educational value generated by the Markstrat simulation game. Considering several theoretical models of experiential learning and the research framework proposed by previous studies, an educational value generation model is developed and validated, using primary data collected from 305 UK-based students. Four types of educational value are identified: experience generation, conceptual understanding, skills development, and affective evaluation. The application of structural equation modelling indicates several significant relationships: experience generation has a strong impact on conceptual understanding, and both of them have medium to high direct impacts on skills development. On the other hand, the participants’ perception regarding the professional skills developed during the simulation game determines their affective evaluation of the Markstrat exercise. The model presented in this study is generalizable to other simulation games, and to other academic disciplines that implement the same experiential learning approach.
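The hypothesised value-generation paths (experience generation feeds conceptual understanding, both feed skills development, and skills development drives affective evaluation) can be sketched as a simplified path analysis with ordinary least-squares regressions. This is only an illustration under assumed variable names and simulated scores, not the article's structural equation model or data.

```python
# Sketch: simplified path analysis of the hypothesised value-generation paths
# (experience -> understanding -> skills -> affect) using OLS regressions.
# Variable names and simulated data are assumptions, not the study's dataset.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 305                                   # sample size reported in the abstract
experience = rng.normal(size=n)
understanding = 0.6 * experience + rng.normal(scale=0.8, size=n)
skills = 0.4 * experience + 0.4 * understanding + rng.normal(scale=0.8, size=n)
affect = 0.5 * skills + rng.normal(scale=0.8, size=n)

def path(y, x_cols, names):
    """Estimate one path (regression) and print the standardised-ish slopes."""
    X = sm.add_constant(np.column_stack(x_cols))
    fit = sm.OLS(y, X).fit()
    print(names, "->", np.round(fit.params[1:], 2))

path(understanding, [experience], ["experience"])
path(skills, [experience, understanding], ["experience", "understanding"])
path(affect, [skills], ["skills"])
```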
|
29 |
Genetic association of high-dimensional traits
Meyer, Hannah Verena January 2018 (has links)
Over the past ten years, more than 4,000 genome-wide association studies (GWAS) have helped to shed light on the genetic architecture of complex traits and diseases. In recent years, phenotyping of the samples has often gone beyond single traits and it has become common to record multi- to high-dimensional phenotypes for individuals. Whilst these rich datasets offer the potential to analyse complex trait structures and pleiotropic effects at a genome-wide level, novel analytic challenges arise. This thesis summarises my research into genetic associations for high-dimensional phenotype data. First, I developed a novel and computationally efficient approach for multivariate analysis of high-dimensional phenotypes based on linear mixed models, combined with bootstrapping (LiMMBo). Both in simulation studies and on real data, I demonstrate the statistical validity of LiMMBo and that it can scale to hundreds of phenotypes. I show the gain in power of multivariate analyses for high-dimensional phenotypes compared to univariate approaches, and illustrate that LiMMBo allows for detecting pleiotropy in a large number of phenotypic traits. Aside from their computational challenges in GWAS, the true dimensionality of very high-dimensional phenotypes is often unknown and lies hidden in high-dimensional space. Retaining maximum power for association studies of such phenotype data relies on using an appropriate phenotype representation. I systematically analysed twelve unsupervised dimensionality reduction methods based on their performance in finding a robust phenotype representation in simulated data of different structure and size. I propose a stability criterion for choosing low-dimensional phenotype representations and demonstrate that stable phenotypes can recover genetic associations. Finally, I analysed genetic variants for associations to high-dimensional cardiac phenotypes based on MRI data from 1,500 healthy individuals. I used an unsupervised approach to extract a low-dimensional representation of cardiac wall thickness and conducted a GWAS on this representation. In addition, I investigated genetic associations to a trabeculation phenotype generated from a supervised feature extraction approach on the cardiac MRI data. In summary, this thesis highlights and overcomes some of the challenges in performing genetic association studies on high-dimensional phenotypes. It describes new approaches for phenotype processing and genotype-to-phenotype mapping for high-dimensional datasets, as well as providing new insights into the genetic structure of cardiac morphology in humans.
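The idea of choosing a stable low-dimensional phenotype representation before association testing can be illustrated with a simple bootstrap procedure: fit a principal-component representation on resampled data and keep the dimensionality whose subspace remains stable. The sketch below is a generic illustration, not LiMMBo or the thesis's actual stability criterion.

```python
# Sketch: bootstrap stability of a PCA-based low-dimensional phenotype
# representation. Generic illustration; not LiMMBo or the thesis's criterion.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
n, p, true_k = 500, 40, 3
latent = rng.normal(size=(n, true_k))
loadings = rng.normal(size=(true_k, p))
Y = latent @ loadings + 0.5 * rng.normal(size=(n, p))    # synthetic phenotypes

def subspace_similarity(A, B):
    """Mean squared canonical correlation between two column subspaces."""
    Qa, _ = np.linalg.qr(A)
    Qb, _ = np.linalg.qr(B)
    s = np.linalg.svd(Qa.T @ Qb, compute_uv=False)
    return float(np.mean(s ** 2))

full = {k: PCA(k).fit(Y).components_.T for k in range(1, 8)}
for k in range(1, 8):
    sims = []
    for _ in range(20):                                   # bootstrap resamples
        idx = rng.integers(0, n, n)
        boot = PCA(k).fit(Y[idx]).components_.T
        sims.append(subspace_similarity(full[k], boot))
    print(f"k = {k}: mean subspace stability = {np.mean(sims):.3f}")
```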
|
30 |
System Design for DSP Applications with the MASIC Methodology
Deb, Abhijit Kumar January 2004 (has links)
The difficulties of system design are persistently increasing due to the integration of more functionality on a system, time-to-market pressure, the productivity gap, and performance requirements. To address these system design problems, design methodologies build system models at a higher abstraction level. However, the design task of mapping an abstract functional model onto a system architecture is nontrivial, because the architecture contains a wide variety of system components and interconnection topologies, and a given functionality can be realized in various ways depending on cost-performance tradeoffs. Therefore, a system design methodology must provide adequate design steps to map the abstract functionality onto a detailed architecture. MASIC (Maths to ASIC) is a system design methodology targeting DSP applications. In MASIC, we begin with a functional model of the system. Next, the architectural decisions are captured to map the functionality onto the system architecture. We present a systematic approach to classify the architectural decisions into two categories: system level decisions (SLDs) and implementation level decisions (ILDs). As a result of this categorization, we only need to consider a subset of the decisions at once. To capture these decisions in an abstract way, we present three transaction level models (TLMs) in the context of DSP systems. These TLMs capture the design decisions using abstract transactions, where timing is modeled only to describe the major synchronization events. As a result, the functionality can be mapped to the system architecture without meticulous detail, and the effects of the design decisions in terms of delay can be simulated quickly. Thus the MASIC approach saves both modeling and simulation time. It also facilitates the reuse of predesigned hardware and software components. To capture and inject the architectural decisions efficiently, we present the grammar-based language of MASIC. This language effectively helps us to implement the steps pertaining to the methodology. A Petri net based simulation technique is developed, which avoids the need to compile the MASIC description to VHDL for the sake of simulation. We also present a divide and conquer based approach to verify the MASIC model of a system. Keywords: System design methodology, Signal processing systems, Design decision, Communication, Computation, Model development, Transaction level model, System design language, Grammar, MASIC.
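The Petri net based simulation idea, executing abstract transactions and their synchronization events directly rather than compiling to VHDL, can be illustrated with a minimal firing loop. Place and transition names below are invented for illustration and do not reflect MASIC's notation or tooling.

```python
# Sketch: minimal Petri-net execution loop, illustrating how abstract
# synchronization events can be simulated directly. Names are invented;
# this is not MASIC's notation or tool flow.
places = {"src_ready": 1, "bus_free": 1, "data_in_buf": 0, "dsp_done": 0}

# Each transition: (tokens consumed, tokens produced).
transitions = {
    "transfer": ({"src_ready": 1, "bus_free": 1}, {"data_in_buf": 1}),
    "compute":  ({"data_in_buf": 1}, {"dsp_done": 1, "bus_free": 1}),
}

def enabled(pre):
    return all(places[p] >= n for p, n in pre.items())

def fire(name):
    pre, post = transitions[name]
    for p, n in pre.items():
        places[p] -= n
    for p, n in post.items():
        places[p] += n
    print(f"fired {name:8s} -> {places}")

# Run until no transition is enabled (completion or deadlock).
while True:
    ready = [t for t, (pre, _) in transitions.items() if enabled(pre)]
    if not ready:
        break
    fire(ready[0])
```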
|