1

An online environmental approach to service interaction management in home automation

Wilson, Michael E. J. January 2005 (has links)
Home automation is maturing with the increased deployment of networks and intelligent devices in the home. Along with new protocols and devices, new software services will emerge and work together, releasing the full potential of networked consumer devices. Services may include home security, climate control or entertainment. With such extensive interworking, the phenomenon known as service interaction, or feature interaction, appears. The problem occurs when services interfere with one another, causing unexpected or undesirable outcomes. The main goal of this work is to detect undesired interactions between devices and services while allowing positive interactions between services and devices. If the interaction is negative, the approach should be able to handle it in an appropriate way. Being able to carry out interaction detection in the home poses certain challenges. Firstly, the devices and services are provided by a number of vendors and will be using a variety of protocols. Secondly, the configuration will not be fixed: the network will change as devices join and leave. Services may also change and adapt to user needs and to the devices available at runtime. The developed approach is able to work with such challenges. Since the goal of the automated home is to make life simpler for the occupant, the approach should require minimal user intervention. With the above goals, an approach was developed which tackles the problem. Whereas previous approaches to solving service interaction have focused on the service, the technique presented here concentrates on the devices and their surroundings, as some interactions occur through conflicting effects on the environment. The approach introduces the concept of environmental variables. A variable may be room temperature, movement or perhaps light. Drawing inspiration from the operating systems domain, locks are used to control access to the devices and environmental variables. Using this technique, undesirable interactions are avoided. The inclusion of the environment is a key element of this approach, as many interactions happen indirectly, through the environment. Since the configuration of a home's devices and services is continually changing, developing an off-line solution is not practical. Therefore, an on-line approach in the form of an interaction manager has been developed. It is the manager's role to detect interactions. The approach was shown to work successfully: the manager was able to detect interactions and prevent negative interactions from occurring, at both device and service level. The approach is flexible: it is protocol independent, services are unaware of the manager, and the manager can cope with new devices and services joining the network. Further, little user intervention is required for the approach to operate.
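The lock-based handling of environmental variables lends itself to a brief illustration. The following Python sketch is illustrative only and assumes hypothetical names (EnvironmentVariable, InteractionManager, request_action); the thesis does not publish its manager's interface, so this is a reading of the idea rather than the actual implementation.

```python
# Illustrative sketch only: all names here are hypothetical, not the thesis's API.
import threading

class EnvironmentVariable:
    """A shared environmental property such as temperature, light or movement."""
    def __init__(self, name):
        self.name = name
        self.lock = threading.Lock()   # OS-style lock guarding the variable
        self.holder = None             # service currently allowed to affect it

class InteractionManager:
    """Grants or refuses device actions according to environmental-variable locks."""
    def __init__(self, variables):
        self.variables = {v.name: v for v in variables}

    def request_action(self, service, affected_variables):
        """Return True if `service` may act; False if another service holds a lock
        on any environmental variable the action would change."""
        acquired = []
        for name in affected_variables:
            var = self.variables[name]
            if var.lock.acquire(blocking=False):
                var.holder = service
                acquired.append(var)
            elif var.holder != service:
                # Negative interaction detected: roll back and refuse the request.
                for v in acquired:
                    v.holder = None
                    v.lock.release()
                return False
        return True

    def release(self, service, affected_variables):
        for name in affected_variables:
            var = self.variables[name]
            if var.holder == service:
                var.holder = None
                var.lock.release()

# Example: climate control and a window-opening service both try to change temperature.
manager = InteractionManager([EnvironmentVariable("temperature"), EnvironmentVariable("light")])
print(manager.request_action("climate_control", ["temperature"]))  # True
print(manager.request_action("window_service", ["temperature"]))   # False - conflict avoided
```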
2

Investigation of Resource Types for OSLC domains Targeting ISO 26262 : Focus on Knowledge Representation of the Right side of the ISO 26262 Software V-model

Padira, Kathyayani January 2016 (has links)
Context: ISO 26262 requires the compilation of traceable work products across the application lifecycle as product-based safety evidence. The compilation of such safety evidence is a time-consuming and arduous task. Open Services for Lifecycle Collaboration (OSLC) is an initiative that supports traceability through tool interoperability. The meta-modelling of the ISO 26262 work products in the structure of the Resource Description Framework (RDF) can be used for achieving interoperability. Thus, OSLC services operating on the RDF exchanged between interoperating tools aid in compiling the product-based safety evidence for an ISO 26262 safety case. Objectives: Representing the compilation of traceable work product types for software testing and verification in ISO 26262 in the form of an RDF-based conceptual meta-model; testing and extending the concepts by instantiating the meta-model with work products to be represented in RDF for the case of a truck Electronic Control Unit (ECU) system; and, lastly, validating the effectiveness of the conceptual meta-model for its compliance with ISO 26262. Methods: To realise the objectives, a case study was conducted at Scania CV AB, Södertälje, Sweden, a manufacturer of safety-critical ECU systems used in heavy vehicles. The case study was conducted in three consecutive cycles: a first cycle of qualitative inductive content analysis of the ISO 26262 standard and its related documents at the company to define the conceptual meta-model; a second cycle of qualitative deductive content analysis to test, extend and refine the conceptual meta-model; and a last cycle validating the effectiveness of the tested and extended conceptual meta-model for compliance with ISO 26262. Results: The main result was the tested, extended and refined RDF-based ISO 26262 conceptual meta-model depicting traceable work product types for the software testing and verification of a safety-critical ECU system. The testing and extending of the conceptual meta-model was performed with respect to the Main1 (M1) ECU system at Scania, and the RDF was defined for the work products of the M1 ECU system. Finally, the conceptual meta-model was validated for its effectiveness in realising the criteria of abstraction, confirmability and traceability based on ISO 26262. Conclusions: The RDF-based conceptual meta-model depicting product-based safety evidence provides a structure for realising the traceability required for compiling the software testing and verification part of an ISO 26262 safety case. The meta-model was tested by defining the RDF for the work products of a truck ECU system that would be exchanged for achieving interoperability. Finally, the conceptual meta-model was validated for representing the knowledge required for showing traceable product-based safety evidence for an ISO 26262 safety case. / ESPRESSO, Scania CV AB, Södertälje / Gen&ReUsableSafety
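To make the idea of RDF-structured work products concrete, here is a minimal sketch using Python's rdflib. The namespaces, class names and properties (SoftwareUnitTestReport, verifies, producedInPhase) are assumptions made for illustration; they are not the vocabulary defined in the thesis.

```python
# Minimal sketch of describing an ISO 26262 work product as an RDF resource.
# The vocabulary and instance URIs below are hypothetical.
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, RDFS

EX = Namespace("http://example.org/iso26262#")     # hypothetical vocabulary
DATA = Namespace("http://example.org/ecu/m1/")     # hypothetical instance data

g = Graph()
g.bind("ex", EX)

report = DATA["unit-test-report-42"]
g.add((report, RDF.type, EX.SoftwareUnitTestReport))               # work product type
g.add((report, RDFS.label, Literal("Unit test report for M1 brake handler")))
g.add((report, EX.verifies, DATA["software-unit-brake-handler"]))  # trace link to the tested unit
g.add((report, EX.producedInPhase, Literal("software unit testing")))

print(g.serialize(format="turtle"))
```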
3

Information Integration Using a Linked Data Approach

Munir, Jawad January 2015 (has links)
Enterprise product development, in our case of either an embedded system or a software application, is a complex task which requires modelling approaches and multiple diverse software tools to be linked together in a tool-chain that supports the development process. Individual tools in the tool-chain maintain an incomplete picture of the development process. Data integration is necessary between these various tools in order to have a unified, consistent view of the whole development process. Information integration between these tools is a challenging task due to the heterogeneity of the tools. Linked data is a promising approach for tool and data integration, where tools are integrated at the data level in a tool-chain. Linked data is an architectural style of integration and requires further definitions and specifications to capture the relationships between data. In our case, tool data are described and shared using OSLC specifications. While such an approach has been widely researched for tool integration, none of this work covers the use of such a distributed approach for lifecycle data integration, management and search. In this thesis work, we investigated the use of a linked data approach for lifecycle data integration. The outcome is a prototype tool-chain architecture for lifecycle data integration, which can support data-intensive queries that require information from various data sources in the tool-chain. The report takes Scania's data integration needs as a case study for the investigation and presents various insights gained during the prototype implementation. Furthermore, the report also presents the key benefits of using a linked data approach for data integration in an enterprise environment. Based on encouraging test results for our prototype, the architecture presented in this report can be seen as a probable solution to lifecycle data integration for an OSLC tool-chain.
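A small sketch can clarify the kind of data-intensive, cross-tool query such an architecture supports. The endpoints, namespaces and properties below are illustrative assumptions only, not the Scania tool-chain or the thesis prototype.

```python
# Minimal sketch of the linked-data integration idea: each tool exposes lifecycle data
# as RDF, a consumer merges the graphs and runs one query spanning several tools.
# All URIs and properties here are hypothetical.
from rdflib import Graph, Literal, Namespace, URIRef

EX = Namespace("http://example.org/lifecycle#")  # hypothetical shared vocabulary

# In a real tool-chain these graphs would be fetched from each tool's OSLC endpoint,
# e.g. Graph().parse("http://requirements-tool/oslc/resources").
requirements = Graph()
requirements.add((URIRef("http://example.org/req/R1"), EX.title, Literal("Brake signal timing")))

tests = Graph()
tests.add((URIRef("http://example.org/test/T7"), EX.validates, URIRef("http://example.org/req/R1")))
tests.add((URIRef("http://example.org/test/T7"), EX.verdict, Literal("passed")))

merged = requirements + tests   # graph union: one consistent view over both tools

# One query that needs information from both data sources at once.
query = """
PREFIX ex: <http://example.org/lifecycle#>
SELECT ?req ?verdict WHERE {
    ?test ex:validates ?req .
    ?test ex:verdict   ?verdict .
    ?req  ex:title     ?title .
}"""
for row in merged.query(query):
    print(row.req, row.verdict)
```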
4

Push-based low-latency solution for Tracked Resource Set protocol : An extension of Open Services for Lifecycle Collaboration specification

Ning, Xufei January 2017 (has links)
Currently, the development of embedded systems requires a variety of software and tools. Moreover, most of these software tools are standalone applications; they are therefore unconnected, and their data can be inconsistent and duplicated. This increases both the heterogeneity and the complexity of the development environment. To address this situation, tool integration solutions based on Linked Data are used, as they provide scalable and sustainable integration across different engineering tools. Different systems can access and share data by following the Linked-Data-based Open Services for Lifecycle Collaboration (OSLC) specification. OSLC uses the Tracked Resource Set (TRS) protocol to enable a server to expose a resource set and to enable a client to discover resources in the resource set. Currently, the TRS protocol relies on client pull for the client to update its data and synchronize with the server. However, this method is inefficient and time-consuming: high-frequency pulling may introduce an extra burden on the network and server, while low-frequency pulling increases the system's latency as seen by the client. A push-based low-latency solution for the TRS protocol was implemented using Message Queue Telemetry Transport (MQTT) technology. The TRS server uses MQTT to push the update patch (called a ChangeEvent) to the TRS client, and the client then updates its content according to this ChangeEvent. As a result, the TRS client synchronizes with the TRS server in real time. Furthermore, a TRS adaptor was developed for Atlassian's JIRA, a widely used project and issue management tool. This JIRA-TRS adaptor provides a TRS provider with the ability to share data via JIRA with other software or tools that use the TRS protocol. In addition, a simulator was developed to simulate the operations in JIRA for a period of time (specifically the create, modify, and delete actions regarding issues); it acts as a validator to check whether the data in the TRS client matches the data in JIRA. An evaluation of the push-based TRS system shows an average synchronization delay of around 30 milliseconds, a major improvement compared with the original TRS system, which synchronized every 60 seconds.
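The push mechanism can be sketched briefly. The following assumes the paho-mqtt 1.x client API; the broker address, topic name and ChangeEvent fields are illustrative assumptions rather than the thesis's actual implementation.

```python
# Illustrative sketch of the push-based idea: the TRS server publishes each ChangeEvent
# over MQTT and the client applies it on arrival instead of polling.
import json
import paho.mqtt.client as mqtt
import paho.mqtt.publish as publish

TOPIC = "trs/change-events"   # hypothetical topic name

# --- server side: push a ChangeEvent as soon as a tracked resource changes ---
def publish_change_event(broker_host, resource_uri, change_type):
    event = {"changed": resource_uri, "type": change_type}  # e.g. "Creation", "Modification", "Deletion"
    publish.single(TOPIC, json.dumps(event), hostname=broker_host, qos=1)

# --- client side: update the local copy whenever an event arrives ---
def on_message(client, userdata, message):
    event = json.loads(message.payload)
    # A real client would fetch or patch the changed resource locally here.
    print("applying", event["type"], "for", event["changed"])

def run_client(broker_host):
    subscriber = mqtt.Client()
    subscriber.on_message = on_message
    subscriber.connect(broker_host, 1883)
    subscriber.subscribe(TOPIC, qos=1)
    subscriber.loop_forever()   # near-real-time synchronization, no polling interval
```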
5

An approach to automate the adaptor software generation for tool integration in Application/Product Lifecycle Management tool chains.

Singh, Shikhar January 2016 (has links)
An emerging problem in organisations is that there exists a large number of tools storing data that must communicate with each other frequently throughout the process of application or product development. However, no means of communication exists that avoids either the intervention of a central entity (usually a server) or the storage of schemas in a central repository. Accessing and linking data across tools is difficult and resource-intensive. As part of the thesis, we develop a piece of software (also referred to as an ‘adaptor’ in the thesis) which, when implemented in lifecycle management systems, integrates data seamlessly. This eliminates the need to store database schemas in a central repository and makes the process of accessing data within tools less resource-intensive. The adaptor acts as a wrapper around the tools and allows them to communicate directly with each other and exchange data. When using the developed adaptor to communicate data between various tools, the data in the relational databases is first converted into RDF format and is then sent or received. Hence, RDF forms the crucial underlying concept on which the software is based. The Resource Description Framework (RDF) provides data integration irrespective of underlying schemas by treating data as resources and representing them as URIs. RDF is a data model used for the exchange and communication of data on the Internet and can be applied to other real-world problems such as tool integration and the automation of communication between relational databases. However, developing this adaptor for every tool requires understanding the individual schema and structure of each tool's database, which again demands considerable effort from the developer of the adaptor. So, the main aim of the thesis is to automate the development of such adaptors. With this automation, the need for anyone to manually assess the database and then develop an adaptor specific to that database is eliminated. Such adaptors and concepts can be used to implement similar solutions in other organisations faced with similar problems. In the end, the output of the thesis is an approach which automates the process of generating these adaptors.
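A minimal sketch of the adaptor idea, under stated assumptions, is shown below: the table's columns are discovered from the database catalogue at runtime and every row is mapped to RDF triples, so no table-specific mapping has to be written by hand. The database, table and vocabulary URIs are hypothetical, and this is not the generation approach actually delivered by the thesis.

```python
# Illustrative relational-to-RDF adaptor sketch: schema is read at runtime, one
# predicate is emitted per column. All names and URIs are hypothetical.
import sqlite3
from rdflib import Graph, Literal, Namespace

EX = Namespace("http://example.org/adaptor/")   # hypothetical vocabulary

def table_to_rdf(db_path, table, key_column):
    graph = Graph()
    conn = sqlite3.connect(db_path)
    cursor = conn.execute(f"SELECT * FROM {table}")                    # fine for a sketch
    columns = [description[0] for description in cursor.description]   # schema discovered at runtime
    for row in cursor:
        record = dict(zip(columns, row))
        subject = EX[f"{table}/{record[key_column]}"]                  # one resource per row
        for column, value in record.items():
            if value is not None:
                graph.add((subject, EX[column], Literal(value)))       # one predicate per column
    conn.close()
    return graph

# Usage with a hypothetical database:
# print(table_to_rdf("issues.db", "issue", "id").serialize(format="turtle"))
```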
6

Efficient service discovery in wide area networks

Brown, Alan January 2008 (has links)
Living in an increasingly networked world, with an abundant number of services available to consumers, the consumer electronics market is enjoying a boom. The average consumer in the developed world may own several networked devices such as games consoles, mobile phones, PDAs, laptops and desktops, wireless picture frames and printers to name but a few. With this growing number of networked devices comes a growing demand for services, defined here as functions requested by a client and provided by a networked node. For example, a client may wish to download and share music or pictures, find and use printer services, or lookup information (e.g. train times, cinema bookings). It is notable that a significant proportion of networked devices are now mobile. Mobile devices introduce a new dynamic to the service discovery problem, such as lower battery and processing power and more expensive bandwidth. Device owners expect to access services not only in their immediate proximity, but further afield (e.g. in their homes and offices). Solving these problems is the focus of this research. This Thesis offers two alternative approaches to service discovery in Wide Area Networks (WANs). Firstly, a unique combination of the Session Initiation Protocol (SIP) and the OSGi middleware technology is presented to provide both mobility and service discovery capability in WANs. Through experimentation, this technique is shown to be successful where the number of operating domains is small, but it does not scale well. To address the issue of scalability, this Thesis proposes the use of Peer-to-Peer (P2P) service overlays as a medium for service discovery in WANs. To confirm that P2P overlays can in fact support service discovery, a technique to utilise the Distributed Hash Table (DHT) functionality of distributed systems is used to store and retrieve service advertisements. Through simulation, this is shown to be both a scalable and a flexible service discovery technique. However, the problems associated with P2P networks with respect to efficiency are well documented. In a novel approach to reduce messaging costs in P2P networks, multi-destination multicast is used. Two well known P2P overlays are extended using the Explicit Multi-Unicast (XCAST) protocol. The resulting analysis of this extension provides a strong argument for multiple P2P maintenance algorithms co-existing in a single P2P overlay to provide adaptable performance. A novel multi-tier P2P overlay system is presented, which is tailored for service rich mobile devices and which provides an efficient platform for service discovery.
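The DHT-based storage and retrieval of service advertisements can be illustrated with a toy example. A real deployment would use a P2P overlay such as Pastry or Chord with proper routing; the in-memory ring and the advertisement format below are assumptions for illustration only.

```python
# Toy DHT: hash a service type onto an identifier ring, store the advertisement on the
# responsible node, and resolve the same key to the same node on lookup.
import hashlib
from bisect import bisect_right

class ToyDHT:
    def __init__(self, node_ids):
        self.ring = sorted(node_ids)                      # identifier ring
        self.storage = {node: {} for node in self.ring}   # each node's local store

    def _hash(self, key):
        return int(hashlib.sha1(key.encode()).hexdigest(), 16) % 2**16

    def _responsible_node(self, key):
        h = self._hash(key)
        index = bisect_right(self.ring, h) % len(self.ring)   # first node clockwise of the key
        return self.ring[index]

    def put(self, service_type, advertisement):
        node = self._responsible_node(service_type)
        self.storage[node].setdefault(service_type, []).append(advertisement)

    def get(self, service_type):
        return self.storage[self._responsible_node(service_type)].get(service_type, [])

dht = ToyDHT(node_ids=[1000, 20000, 45000, 60000])
dht.put("printer", {"address": "192.168.1.12:631", "description": "colour laser"})
print(dht.get("printer"))   # any node resolves the same key to the same advertisement
```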
7

Investigation of an OSLC-domain targeting ISO 26262 : Focus on the left side of the Software V-model

Castellanos Ardila, Julieth Patricia January 2016 (has links)
Industries have adopted a standardized set of practices for developing their products. In the automotive domain, the provision of safety-compliant systems is guided by ISO 26262, a standard that specifies a set of requirements and recommendations for developing automotive safety-critical systems. To be in compliance with ISO 26262, the safety lifecycle proposed by the standard must be included in the development process of a vehicle. In addition, a safety case that shows that the system is acceptably safe has to be provided. The provision of a safety case implies the execution of a precise documentation process, which makes sure that the work products are available and traceable. Further, documentation management is defined in the standard as a mandatory activity, and guidelines are proposed/imposed for its elaboration. It is worth pointing out that a well-documented safety lifecycle will provide the necessary inputs for the generation of an ISO 26262-compliant safety case. The OSLC (Open Services for Lifecycle Collaboration) standard and the maturing stack of semantic web technologies represent a promising integration platform for enabling semantic interoperability between the tools involved in the safety lifecycle. Tools for requirements, architecture and development management, among others, are expected to interact and share data with the help of domain specifications created in OSLC. This thesis proposes the creation of an OSLC tool-chain infrastructure for sharing safety-related information, where fragments of safety information can be generated. The steps carried out during the elaboration of this master thesis consist of the identification, representation, and shaping of the RDF resources needed for the creation of a safety case. The focus of the thesis is limited to a small portion of the left-hand side of the ISO 26262 V-model, more exactly part 6, clause 8 of the standard: Software unit design and implementation. Despite the use of a restricted portion of the standard during the execution of this thesis, the findings can be extended to other parts, and the conclusions can be generalized. This master thesis is considered one of the first steps towards the provision of an OSLC-based and ISO 26262-compliant methodological approach for representing and shaping the work products resulting from the execution of the safety lifecycle, documentation required in the composition of an ISO-compliant safety case. / Espresso 2 / Gen&ReuseSafetyCases
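One way to picture how such RDF work-product resources could back a safety case is a traceability check over them. The sketch below is an assumption-laden illustration: the vocabulary (SoftwareUnitDesign, implementedBy) and instance URIs are hypothetical and do not come from the thesis.

```python
# Illustrative traceability check: report software unit designs with no trace link
# to an implementation. Vocabulary and URIs are hypothetical.
from rdflib import Graph, Namespace, URIRef
from rdflib.namespace import RDF

EX = Namespace("http://example.org/iso26262#")   # hypothetical vocabulary
g = Graph()

unit = URIRef("http://example.org/unit/brake-handler-design")
g.add((unit, RDF.type, EX.SoftwareUnitDesign))
g.add((unit, EX.implementedBy, URIRef("http://example.org/unit/brake-handler.c")))

orphan = URIRef("http://example.org/unit/watchdog-design")
g.add((orphan, RDF.type, EX.SoftwareUnitDesign))   # deliberately missing a trace link

missing = g.query("""
PREFIX ex: <http://example.org/iso26262#>
SELECT ?design WHERE {
    ?design a ex:SoftwareUnitDesign .
    FILTER NOT EXISTS { ?design ex:implementedBy ?impl }
}""")
for row in missing:
    print("no implementation traced for", row.design)
```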
8

Développement numérique, territoires et collectivités : vers un modèle ouvert / Digital development, territories and public bodies : towards an open model

Houzet, Sophie 23 April 2013 (has links)
Over the past twenty years or so, Information and Communications Technology (ICT) has spread into our everyday lives and throughout our land. Its mass take-up, combined with the deregulation of telecommunications and the continuous development of the web, raises the issue of its impact on spatial and territorial organisation. The object of this research is to offer particular areas an interpretation of what has changed for them in the fields of digital planning, service innovation and the take-up of ICT facilities, as well as in the joint construction of public goods and the creation of ecosystems based on the values of the knowledge society. The first part is devoted to the fast-moving context of the spread of ICT throughout society. Employing an approach that combines legislative, technological, economic and organisational dimensions, the challenges of the knowledge society are examined at a national level and with reference to European strategic policy frameworks. The second part analyses the geographical distribution of ICT, which does not equate merely to whether the technologies available are present or absent: it also addresses operators' commercial strategies, political priorities and the differential take-up of use. Not all these factors operate at the same pace or are structured in the same way. The methodological basis is a three-tier analysis of networks, presented by G. Dupuy in 1991. The dozen indicators that contribute to this analysis were selected for their structural character: roll-out of alternative PPP broadband networks, location of ICT businesses, networking of Digital Public Spaces, deployment of e-administration service platforms, pooling of geographical data, digital sector leadership, access to public data, etc. The intersections between these indicators have allowed a landscape to be painted of the spatial distribution of ICT within France over time, spanning a 10-year period between 2002 and 2012. The complexity generated could not be explained in detail without adopting a systemic approach so as to model the development of different areas over time and the challenges that face decision-makers. The theoretical benchmark used as the basis for part three is the systemic approach, for which the work of Joël de Rosnay in "Le Macroscope" (1975) was a precursor as regards the development of an ICT-equipped society. The objective is to provide decision-makers with keys to assist comprehension as well as to encourage a global approach to the development of their areas within the knowledge society. At a time when the web is evolving towards the semantic web and energy creation and distribution models are moving towards distributed models, a change of model is also possible in telecommunications networks: from the dominant, operator-oriented "integrated" model towards a more horizontal, user-oriented "open" model. A systems modelling approach explains today's choices, which will shape the territories of tomorrow.
9

Efeitos de circulação do discurso em serviços substitutivos de saúde mental: uma perspectiva psicanalítica / Effects of the circulation of discourse in substitutive mental health services: a psychoanalytic perspective

Kyrillos Neto, Fuad 19 March 2007 (has links)
This study discusses the circulation of knowledge related to the notion of social inclusion in the open mental health services, based on the guidelines established by the federal government. To do so, the text presents the Foucauldian assumptions about madness used by Italian Democratic Psychiatry, as well as the theoretical assumptions of Franco Basaglia's work. The Basaglian model brought positive contributions to the mental health field, since it pointed out the relevance of the social insertion of psychiatric patients. However, criticisms are made of the conception of the subject used in the implementation of this model and of the kind of social bond it implies. These criticisms are based on the Lacanian notion of the subject, the psychoanalytic conception of the psychoses and the Lacanian theory of the four discourses. With psychoanalysis as a guiding axis, we dialogue with thinkers from other fields of knowledge, especially those interested in ideology and in symbolic power. Starting from fragments of everyday life in the open services, as well as from fragments of clinical cases, the study criticises the social bond produced in the substitutive mental health services and points out that the workers occupy the place of mastery in their relation with the users, having the imperative of social inclusion as their master signifier. Finally, the study discusses proposals by authors from the field of psychoanalysis which endorse the following fundamental requirements for a consistent treatment and for an effective inclusion of the psychotic subject: to include the notion of structure in the elaboration of the diagnosis, to include the concept of foreclosure in the semiology and in the very structure of the network of institutional services of the psychiatric reform, and to include the psychoanalytic concept of the subject of the unconscious and the premise of the subject's implication in the treatment.
