271
Utveckling av konsultkompetenssystem : Implementering av en kompetensdatabas för ett konsultbolag / Developing a consultant skill system : Implementation of a skills database for a consulting company
Öh, Rickard, January 2011
Syftet med detta examensarbete är att undersöka om det är möjligt att lagra och presentera medarbetares kompetens inom ett kunskapsintensivt företag. Initiativtagare till examensarbetet är konsultbolaget Nethouse Sverige AB. Allteftersom Nethouses medarbetarantal växer blir det svårare att på ett bra sätt strukturera och sammanställa kompetensen och kapaciteten som finns inom företaget. Det blir också svårare att hålla koll på inom vilka områden enskilda medarbetare utvecklas. Dagens arbetssätt medför flera problem: det är svårt att sammanfatta kompetensen inom (medarbetare) och utanför (arbetssökande) Nethouse, uppdatering av konsultprofilen (Word-dokumentet) glöms lätt bort, och det är svårt att söka efter medarbetare/arbetssökande med vissa kompetenser. För att lösa problemen har en förstudie inom kravhantering genomförts. Förstudien resulterade i ett antal kravinsamlingsmetoder som sedan användes för att extrahera de krav som fanns på ett kompetenssystem. Kraven uppfylldes genom att implementera en webbapplikation där varje medarbetare på Nethouse ges möjligheten att skapa en egen profil. En profil innehåller bland annat de uppdrag, kompetenser och roller som medarbetaren har erfarenhet av. Denna information är även sökbar och möjlighet finns att exportera profilen. Systemet kommer att spara Nethouses medarbetare mycket tid vid kompetenssammanställning och sökning. / The purpose of this thesis is to investigate whether it is possible to store and present employees' skills and competencies in a knowledge-intensive company. The initiator of this thesis is the consulting firm Nethouse Sverige AB. As Nethouse's number of employees grows, it becomes increasingly difficult to structure and manage the skills and competencies that exist within the company. It also becomes harder to keep track of the areas in which individual employees develop. Today each employee enters their skills and competencies in a Word document (one per employee) called a consultant profile.
This way of recording skills and competencies causes several problems: it is difficult to summarize the competence inside (employees) and outside (jobseekers) Nethouse; updating the consultant profile is easily forgotten; and it is difficult to search for employees or jobseekers with specific skills. To address these problems in a structured manner, a preliminary study in requirements management was performed. The study resulted in a number of requirements-gathering techniques, which were then used to extract the requirements for the system to be developed. The requirements were realized by implementing a web application in which every employee at Nethouse can create their own consultant profile. A profile includes, among other things, the assignments, competencies, and roles that the employee has experience of. This information is also searchable, and the profile can be exported to Word format. The system will save Nethouse's employees considerable time when searching for and compiling competencies.
272
SurfaLätt - .NET- och COM-baserad webbläsare för personer med vissa funktionsnedsättningar
Persson, Martin, January 2012
Personer med vissa kognitiva funktionsnedsättningar har ibland svårt att använda en vanlig webbläsare. Detta gör internet mindre tillgängligt för denna grupp. Med hjälp av en anpassad webbläsare med förenklat gränssnitt och funktionalitet finns förhoppning om att användarvänlighetsbarriärer för internetanvändning hos dessa personer skall minska och att användningen skall kunna vara mer självständig. Som teknisk målplattform används Microsoft Windows med Internet Explorer och utvecklingen genomfördes inom .NET med C#. Ett mål med den tekniska implementationen var att till så stor del som möjligt utnyttja redan installerad programvara på målsystemen och därmed minimera behovet av programuppdateringar, vilket skulle göra programmet mer framtidssäkert, samt minimera mängden administration av programmet. Tester genomförda på Västmanlands läns landstings Datatek på Handikappcentrum i Västerås gav positivt resultat. Användarna kunde fokusera på de webbsidor de besökte och personalen kunde snabbt ställa in programmet för användaren. / People with certain cognitive disabilities sometimes have difficulty using a standard web browser, which makes the Internet less accessible to this group. With the help of a custom web browser with a simplified user interface and feature set, the hope is that the usability barriers around Internet use can be lowered, allowing this group of users to use the Internet more independently. Microsoft Windows with Internet Explorer was used as the technical platform, and development was carried out in .NET with C#. One aim of the technical implementation was to use software already installed on the target systems to the greatest degree possible, thus minimizing the need for software updates, making the application more future-proof, and also minimizing the amount of administration needed. Tests conducted at the Västmanland County Council Datatek at Handikappcentrum in Västerås gave positive results: users could focus on the websites they visited, and staff could quickly set up the application for each user.
273
Optimering av prestanda och utnyttjande av flertrådsteknik / Optimization of performance and utilization of multithreading
Gustafsson, Daniel; Öberg, Christoffer, January 2009
The purpose of this thesis was to help the company Medius AB optimize selected parts of an existing system, minimizing execution time by introducing multithreading. The idea was to restructure the code so that calculations could be executed concurrently. The main work consisted of three optimizations. The first was to restructure a large for-loop so that each iteration's work executed in its own thread. The second optimization was likewise a restructuring of a for-loop so that its work could be executed in separate threads. The third and last optimization consisted of restructuring a stored procedure in the database so that its different parts could each be executed in its own thread, creating data concurrently. The optimizations were implemented in Visual Basic .NET, using its support for multithreading and connection pools. As a result, one of the modules executed 12% faster and the other 25% faster. The aims and requirements of this thesis were thus fulfilled: the system now makes better use of its resources.
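The thesis implements its optimizations in Visual Basic .NET, and that code is not reproduced in the abstract. As an illustration only, the same idea of restructuring a for-loop so that each iteration's work runs in its own thread can be sketched in Python (all names here are invented for the example):

```python
from concurrent.futures import ThreadPoolExecutor

def process_item(item):
    # Stand-in for the per-iteration work inside the original for-loop
    # (in the thesis this was a calculation per record).
    return item * item

items = list(range(10))

# Sequential version: iterations run one after another.
sequential = [process_item(i) for i in items]

# Restructured version: the loop body is submitted to a thread pool so
# iterations can run concurrently; pool.map preserves input order.
with ThreadPoolExecutor(max_workers=4) as pool:
    concurrent_results = list(pool.map(process_item, items))

assert concurrent_results == sequential
```

For CPU-bound work in Python the interpreter lock limits the speedup; the point here is only the loop restructuring, which in the thesis's .NET setting did run in parallel.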
274
Utvärdering av stödet för utveckling av Web Services i J2EE och .NET
Nilsson, Hampus, January 2005
To facilitate program-to-program communication, SOA (Service Oriented Architecture) has been developed. SOA aims to provide guidelines that make it easier for two applications to communicate with each other (Samtani & Sadhwani 2002a). Web Services is a standard built on the SOA architecture, and what makes Web Services special is that the technology is intended to be platform- and language-independent (Clabby, 2003). Sun and Microsoft have each developed a platform for building Web Services applications. This report presents a literature study examining the support each platform offers for developing Web Services and identifying the differences between the platforms. The differences are then analyzed to see whether they can cause problems when a user wants to integrate Web Services written on different platforms. The results show that both platforms offer good support for developing Web Service applications; both provide a range of functions and classes that assist the developer. The comparison shows that it is the platform-specific data types that can cause problems when integrating Web Services written on different platforms.
275
ELASTIC NET FOR CHANNEL ESTIMATION IN MASSIVE MIMO
Peken, Ture; Tandon, Ravi; Bose, Tamal, 10 1900
Next-generation wireless systems will support higher data rates, improved spectral efficiency, and lower latency. Massive multiple-input multiple-output (MIMO) is proposed to satisfy these demands. In massive MIMO, many benefits come from employing hundreds of antennas at the base station (BS) and serving dozens of user terminals (UTs) per cell. As the number of antennas at the BS increases, the channel becomes sparse. By exploiting this channel sparsity, compressive sensing (CS) methods can be used to estimate the channel, and the pilot sequences can be shortened compared with purely pilot-based methods. In this paper, a novel channel estimation algorithm based on a CS method called the elastic net is proposed. The channel estimation accuracy of pilot-based, lasso-based, and elastic-net-based methods in massive MIMO is compared. It is shown that the elastic-net-based method gives the lowest estimation error when few pilot symbols are available and at low SNR values.
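The paper's estimator itself is not given in the abstract. As an illustration of the underlying idea only, the elastic net's combined L1/L2 penalty can be used to recover a sparse vector from noisy linear measurements, a toy stand-in for sparse channel estimation (all dimensions and parameter values below are arbitrary assumptions):

```python
import numpy as np
from sklearn.linear_model import ElasticNet

rng = np.random.default_rng(0)

# Toy sparse "channel": 100 coefficients, only 5 non-zero.
n_features, n_measurements = 100, 40
h = np.zeros(n_features)
h[rng.choice(n_features, size=5, replace=False)] = rng.normal(size=5)

# Linear measurements y = A h + noise; A plays the role of the pilot matrix.
A = rng.normal(size=(n_measurements, n_features))
y = A @ h + 0.01 * rng.normal(size=n_measurements)

# Elastic net: least squares plus a blend of lasso (L1) and ridge (L2)
# penalties, controlled by alpha (strength) and l1_ratio (blend).
model = ElasticNet(alpha=0.01, l1_ratio=0.9, max_iter=10000)
model.fit(A, y)
h_hat = model.coef_

# The estimate should be sparse even though measurements < coefficients.
print("non-zeros in estimate:", int(np.sum(np.abs(h_hat) > 1e-3)))
```

This is the regression view of the method; the paper applies the same penalty structure to the massive MIMO channel estimation problem.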
276
SPÅRA Fisk 1.0 : En webbapplikation för att hantera fisk- och vattenbrukspartier som omfattas av spårbarhetskraven inom fiskbranschen
Hennings, Gabriella, January 2016
This project concerns the development of SPÅRA Fisk, a web application created to manage consignments of fish and aquaculture products. SPÅRA Fisk is intended to simplify consignment handling for companies in the fishing industry, covering registration of new consignments, creation of production consignments, registration of sales and processing, and termination. The product was developed to simplify the handling process for companies in the fishing industry that operate within the EU and are therefore subject to the traceability requirements taking effect at the end of 2016. The traceability requirements mean that essentially all trade in, and processing of, fish and aquaculture products must be reported to the Swedish Agency for Marine and Water Management (Havs- och vattenmyndigheten). The application is built with responsive design and works on both desktop and mobile, although the mobile version leaves room for future improvement. The application is written in ASP.NET and communicates with a database through a web service.
277
Location Optimization of Dairy Processing
Reecy, Michael, January 1900
Master of Agribusiness / Department of Agricultural Economics / Jason S. Bergtold / Location optimization of a new dairy processing plant is crucial given the significant capital investment of $350 million required to build the plant. Coupled with notable differences in milk and transportation costs across locations, this warrants an examination of the historical Net Present Value (NPV) of Earnings Before Interest, Taxes, Depreciation and Amortization (EBITDA), discounted at a rate of 3%, to help determine the optimal location for a new dairy processing plant. This thesis examines historical EBITDA NPV for three locations, Dumas, TX, Sioux Falls, SD, and Lansing, MI, in an effort to predict the optimal location of a future dairy processing plant. These locations were chosen because each has the milk supply needed both to encourage milk production and to support increases in dairy processing. The prices dairy processors receive for cheese can fluctuate but are not tied to the location in which the cheese is produced. Transportation costs for the cheese are determined by the distance from the processing plant to Plymouth, WI, where most further cheese processing takes place. The thesis therefore includes a sensitivity analysis for the Lansing, MI location to determine a breakeven milk cost and cheddar cheese price.
The NPV was positive for the Dumas, TX location at $100 million, compared with (-$820) million and (-$247) million for the Sioux Falls, SD and Lansing, MI locations, respectively. The results indicate an emerging EBITDA NPV trend favoring the Lansing, MI location, which had the best performance in the last two years (2016-2017) at $104 million, compared with negative performance at both other locations. The previous eight years' performance would favor the Dumas, TX location; however, more weight was given to the past two years' performance as an indicator of future economic returns. As a result, this thesis concludes that the Lansing, MI location is the most favorable location for a new dairy processing investment.
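The NPV figures above come from the thesis's own data, which the abstract does not reproduce. The discounting itself is standard; a minimal sketch using the 3% rate from the abstract, with invented cash flows (the $350M investment size is from the abstract, the annual EBITDA figures are not):

```python
def npv(rate, cash_flows):
    """Net present value of annual cash flows, where cash_flows[0]
    occurs at year 0 and is therefore undiscounted."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# Illustrative only: a $350M plant outlay followed by ten years of
# assumed annual EBITDA of $45M (in $ millions).
flows = [-350.0] + [45.0] * 10
value = npv(0.03, flows)
print(f"NPV at 3%: {value:.1f} $M")
```

A positive NPV at the chosen discount rate is what makes a location's historical earnings stream attractive, which is the comparison the thesis runs across the three sites.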
278
Inscripciones intermedias, instancias mediales.
Cruz Valenzuela, Daniel, January 2004
No description available.
279
Regularized Markov Model for Modeling Disease Transitioning
Huang, Shuang, January 2017
In longitudinal studies of chronic diseases, the disease states of individuals are often collected at several pre-scheduled clinical visits, but the exact states and the times of transitioning from one state to another between observations are not observed. This is commonly referred to as "panel data". Statistical challenges arise in panel data when identifying predictors governing the transitions between disease states from only a partially observed disease history. Continuous-time Markov models (CTMMs) are commonly used to analyze panel data and allow maximum likelihood estimation without making any assumptions about the unobserved states and transition times. By assuming that the underlying disease process is Markovian, CTMMs yield a tractable likelihood. However, CTMMs generally allow covariate effects to differ across transitions, resulting in many more coefficients to estimate than there are covariates, and model overfitting can easily happen in practice. In three papers, I develop a regularized CTMM using the elastic net penalty for panel data and implement it in an R package. The proposed method is capable of simultaneous variable selection and estimation even when the dimension of the covariates is high.
In the first paper (Section 2), I use the elastic net penalty to regularize the CTMM and derive an efficient coordinate descent algorithm to solve the corresponding optimization problem. The algorithm takes advantage of the multinomial state distribution under the non-informative observation scheme assumption to simplify the computation of key quantities. A simulation study shows that this method can effectively select the true non-zero predictors while reducing model size.
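The dissertation's coordinate descent operates on the CTMM likelihood, which is not given in the abstract. The mechanics of a coordinate descent update under an elastic net penalty are easier to see on a least-squares objective; the sketch below is that simpler, standard setting, not the CTMM version (all data and tuning values are illustrative):

```python
import numpy as np

def soft_threshold(z, gamma):
    # Soft-thresholding: closed-form solution of the 1-D lasso subproblem.
    return np.sign(z) * max(abs(z) - gamma, 0.0)

def elastic_net_cd(X, y, alpha, l1_ratio, n_iter=200):
    """Coordinate descent for (1/2n)||y - Xb||^2 + alpha*(l1_ratio*||b||_1
    + 0.5*(1 - l1_ratio)*||b||_2^2). One coefficient updated at a time."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n
    for _ in range(n_iter):
        for j in range(p):
            r_j = y - X @ b + X[:, j] * b[j]   # partial residual excluding j
            rho = X[:, j] @ r_j / n
            b[j] = soft_threshold(rho, alpha * l1_ratio) / (
                col_sq[j] + alpha * (1 - l1_ratio))
    return b

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 10))
true_b = np.array([2.0, 0, 0, -1.5, 0, 0, 0, 0, 0, 0])
y = X @ true_b + 0.1 * rng.normal(size=50)
b_hat = elastic_net_cd(X, y, alpha=0.1, l1_ratio=0.9)
print(np.round(b_hat, 2))
```

The L1 part of the penalty sets small coefficients exactly to zero (variable selection), while the L2 part stabilizes the estimates; the dissertation applies the same update structure to the derivatives of the CTMM log-likelihood.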
In the second paper (Section 3), I extend the regularized CTMM developed in the first paper to accommodate exact death times and censored states. Death is commonly included as an endpoint in longitudinal studies, and the exact time of death can easily be obtained, but the state path leading to death is usually unknown. I show that exact death times result in a very different form of the likelihood, and the dependency of the death time on the model requires significantly different numerical methods for computing the derivatives of the log-likelihood, a key quantity for the coordinate descent algorithm. I propose to use numerical differentiation to compute these derivatives. Computation of the derivatives of the log-likelihood for a transition involving a censored state is also discussed. I carry out a simulation study to evaluate the performance of this extension, which shows consistently good variable selection properties and prediction accuracy comparable to oracle models in which only the true non-zero coefficients are fitted. I then apply the regularized CTMM to airflow limitation data from the TESAOD (Tucson Epidemiological Study of Airway Obstructive Disease) study, with exact death times and censored states, and obtain a prediction model greatly reduced in size from a total of 220 potential parameters.
Methods developed in the first two papers are implemented in an R package, markovnet, and the third paper (Section 4) gives a detailed introduction to the key functionalities of the package, demonstrated on a simulated data set. Finally, concluding remarks are given and directions for future work are discussed (Section 5).
The outline of this dissertation is as follows. Section 1 presents an in-depth background on panel data, CTMMs, and penalized regression methods, as well as a brief description of the TESAOD study design. Section 2 describes the first paper, "Regularized continuous-time Markov model via elastic net". Section 3 describes the second paper, "Regularized continuous-time Markov model with exact death times and censored states". Section 4 describes the third paper, "Regularized continuous-time Markov model for panel data: the markovnet package for R". Section 5 gives an overall summary and a discussion of future work.
280
Maximum net power output from an integrated design of a small-scale open and direct solar thermal Brayton cycle
Le Roux, Willem Gabriel, 22 September 2011
The geometry of the receiver and recuperator in a small-scale open and direct recuperative solar thermal Brayton cycle can be optimised in such a way that the system produces maximum net power output. The purpose of this work was to apply the second law of thermodynamics and entropy generation minimisation to optimise these geometries using an optimisation method. The dynamic trajectory optimisation method was used, and off-the-shelf micro-turbines and a range of parabolic dish concentrator diameters were considered. A modified cavity receiver was used in the analysis, with an assumed cavity wall construction of either a circular tube or a rectangular channel. A maximum temperature constraint of 1 200 K was set for the receiver surface temperature. A counterflow plate-type recuperator was considered, and the recuperator length was constrained to the length of the radius of the concentrator. Systems producing a steady-state net power output of 2 to 100 kW were analysed. The effect of various conditions, such as wind, receiver inclination, and concentrator rim angle, on the maximum net power output and optimum geometry of the system was investigated. Forty-five different micro-turbines and seven concentrator diameters between 6 and 18 metres were considered. Results show the optimum geometries, optimum operating conditions and minimum entropy generation as a function of the system mass flow rate. The optimum receiver tube diameter was relatively large when compared with the receiver size. The optimum counterflow plate-type recuperator channel aspect ratio is a linear function of the optimum system mass flow rate for a constant recuperator height. The optimum recuperator length and optimum NTU are small at small system mass flow rates but increase with the system mass flow rate until the length constraint is reached.
For the optimised systems with maximum net power output, the solar receiver is the main contributor to the total rate of minimum entropy generation, followed by the recuperator, compressor and turbine. Results show that the irreversibilities were spread throughout the system in such a way that the minimum internal irreversibility rate was almost three times the minimum external irreversibility rate for all optimum system geometries and for different concentrator diameters. For a specific environment and set of parameters, there exists an optimum receiver and recuperator geometry at which the system produces maximum net power output. / Dissertation (MEng), University of Pretoria, 2011. / Mechanical and Aeronautical Engineering
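The objective being maximised above, net power output, is the turbine work minus the compressor work. As background only, that quantity can be sketched from first-law component balances for a simple (non-recuperative) Brayton cycle; the efficiencies, temperatures and pressure ratio below are illustrative assumptions, not the thesis's optimised values:

```python
CP = 1004.0  # J/(kg K), specific heat of air, assumed constant

def brayton_net_power(m_dot, T1, r_p, T_turbine_in,
                      eta_c=0.78, eta_t=0.80, gamma=1.4):
    """First-law sketch: net power = turbine work - compressor work,
    with isentropic relations corrected by component efficiencies."""
    # Compressor: isentropic outlet temperature, then efficiency penalty.
    T2s = T1 * r_p ** ((gamma - 1) / gamma)
    w_comp = CP * (T2s - T1) / eta_c
    # Turbine: isentropic expansion back to ambient pressure.
    T4s = T_turbine_in * r_p ** (-(gamma - 1) / gamma)
    w_turb = eta_t * CP * (T_turbine_in - T4s)
    return m_dot * (w_turb - w_comp)

# Example: 0.06 kg/s air, 300 K ambient, pressure ratio 3,
# turbine inlet 1050 K (below the 1 200 K receiver constraint).
print(brayton_net_power(0.06, 300.0, 3.0, 1050.0), "W")
```

The thesis's optimisation additionally accounts for the recuperator, receiver losses and entropy generation in each component; this sketch only shows why the net power is sensitive to temperatures and mass flow rate.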