  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
61

L-DOPA administration shifts the stability-flexibility balance towards attentional capture by distractors during a visual search task

Riedel, P., Domachowska, I. M., Lee, Y., Neukam, P. T., Tönges, L., Li, S. C., Goschke, T., Smolka, M. N. 21 May 2024 (has links)
Rationale: The cognitive control dilemma describes the necessity to balance two antagonistic modes of attention: stability and flexibility. Stability refers to goal-directed thought, feeling, or action; flexibility refers to the complementary ability to adapt to an ever-changing environment. Their balance is thought to be maintained by neurotransmitters such as dopamine, most likely in a U-shaped rather than linear manner. However, human studies of the stability-flexibility balance using a dopaminergic agent and/or measurement of brain dopamine are scarce.

Objective: The study aimed to investigate the causal involvement of dopamine in the stability-flexibility balance and the nature of this relationship in humans.

Methods: Distractibility was assessed as the difference in reaction time (RT) between distractor and non-distractor trials in a visual search task. In a randomized, placebo-controlled, double-blind, crossover study, 65 healthy participants performed the task under placebo and under a dopamine precursor (L-DOPA). Using 18F-DOPA-PET, dopamine availability in the striatum was examined at baseline to investigate its relationship to the RT distractor effect and to the L-DOPA-induced change of that effect.

Results: There was a pronounced RT distractor effect in the placebo session that increased under L-DOPA. Neither the RT distractor effect in the placebo session nor the magnitude of its L-DOPA-induced increase was related to baseline striatal dopamine.

Conclusions: L-DOPA administration shifted the stability-flexibility balance towards attentional capture by distractors, suggesting causal involvement of dopamine. This finding is consistent with current theories of prefrontal cortex dopamine function. The current data can neither confirm nor falsify the inverted-U-shaped function hypothesis with regard to cognitive control.
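The distractibility measure above is simple to state concretely. The sketch below (with invented reaction times; all names and numbers are illustrative, not the study's code or data) computes the RT distractor effect per session and its L-DOPA-induced change:

```python
# Illustration of the RT distractor effect: mean RT on distractor trials
# minus mean RT on non-distractor trials. All data are invented.
from statistics import mean

def rt_distractor_effect(distractor_rts, non_distractor_rts):
    """Larger values indicate stronger attentional capture by distractors."""
    return mean(distractor_rts) - mean(non_distractor_rts)

# Hypothetical reaction times (ms) for one participant in each session.
placebo_effect = rt_distractor_effect([620, 640, 660], [590, 600, 610])  # 40.0
ldopa_effect = rt_distractor_effect([680, 690, 700], [595, 600, 605])    # 90.0

# The L-DOPA-induced change of the RT distractor effect: the quantity the
# study relates to baseline striatal dopamine.
change = ldopa_effect - placebo_effect  # 50.0
```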
62

The Effects of New Product Branding Strategy on Consumer Evaluations of Counterextensions, with Consumer Product Knowledge and Original-Firm Brand Breadth as Moderating Variables / The Effects of New Product Branding Strategy on Customer Evaluation of Brand Counterextensions

王馨 Unknown Date (has links)
Using the perceived product-category similarity between the original product and the extension product as a mediating variable, this study examines how two new-product branding strategies, brand extension and brand umbrella, affect consumer evaluations of counterextensions. A counterextension occurs when firm 1 (the original firm), originally in category A (the original category), extends its brand into category B (the extension category), and firm 2 (the counterextending firm), originally in category B, then extends in the reverse direction into category A. The study also examines how consumer product knowledge and the original firm's brand breadth moderate the effect of branding strategy, and in turn perceived category similarity and counterextension evaluations.

The experiment used a 2 (branding strategy of the successful new product: brand extension vs. brand umbrella) x 2 (original firm's brand breadth: narrow brand Nokia vs. broad brand LG) x 2 (consumer product knowledge: high vs. low) design, plus a control group given no information about a prior successful extension. Based on pretest results, mobile phones were selected as the original product and digital cameras as the extension product. The narrow-brand phone maker launching the new product was Nokia and the broad-brand maker was LG, while Canon was the digital-camera brand performing the counterextension.

The results show that perceived category similarity between the original and extension product categories mediates the relationship between new-product branding strategy and counterextension evaluations. When the original firm successfully launches a new product in another category, the brand-extension strategy raises perceived category similarity more than the brand-umbrella strategy does and leads to more favorable counterextension evaluations. As for moderation, consumers with low product knowledge show a larger increase in perceived category similarity and more favorable counterextension evaluations, but smaller differences between the two branding strategies. Regarding brand breadth, a narrow original brand likewise produces a larger increase in perceived category similarity and more favorable counterextension evaluations, while the differences between branding strategies in both perceived similarity and counterextension evaluations are larger for narrow-brand firms.
63

Near-capacity sphere decoder based detection schemes for MIMO wireless communication systems

Kapfunde, Goodwell January 2013 (has links)
The search for the closest lattice point arises in many communication problems, and is known to be NP-hard. The Maximum Likelihood (ML) Detector is the optimal detector which yields an optimal solution to this problem, but at the expense of high computational complexity. Existing near-optimal methods used to solve the problem are based on the Sphere Decoder (SD), which searches for lattice points confined in a hyper-sphere around the received point. The SD has emerged as a powerful means of finding the solution to the ML detection problem for MIMO systems. However, the bottleneck lies in the determination of the initial radius. This thesis is concerned with the detection of transmitted wireless signals in Multiple-Input Multiple-Output (MIMO) digital communication systems as efficiently and effectively as possible. The main objective of this thesis is to design efficient ML detection algorithms for MIMO systems based on depth-first search (DFS) algorithms whilst taking into account complexity and bit error rate performance requirements for advanced digital communication systems. The increased capacity and improved link reliability of MIMO systems without sacrificing bandwidth efficiency and transmit power serve as the key motivation behind the study of MIMO detection schemes. The fundamental principles behind MIMO systems are explored in Chapter 2. A generic framework for linear and non-linear tree-search based detection schemes is then presented in Chapter 3. This paves the way for different methods of improving the achievable performance-complexity trade-off for all SD-based detection algorithms. The suboptimal detection schemes, in particular the Minimum Mean Squared Error-Successive Interference Cancellation (MMSE-SIC), will also serve as pre-processing as well as comparison techniques, whilst capacity-approaching Low Density Parity Check (LDPC) codes will be employed to evaluate the performance of the proposed SD.
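The depth-first tree search at the heart of all SD variants can be sketched in a few lines. The following is a minimal decoder for a small real-valued model y = Hx + n; function names and the radius-shrinking policy are illustrative, and the thesis' algorithms add SIC, bound control and preprocessing not shown here:

```python
# Minimal depth-first sphere decoder for y = H x + n (real-valued sketch).
# Symbols are drawn per-entry from a finite constellation; branches whose
# partial Euclidean distance leaves the sphere are pruned, and the radius
# shrinks whenever a better candidate is found.
import numpy as np

def sphere_decode(H, y, symbols, radius):
    """Return (x_hat, d2): the constellation vector minimising ||y - H x||
    within the given radius, or (None, radius**2) if the sphere is empty."""
    Q, R = np.linalg.qr(H)          # ||y - H x|| == ||z - R x|| for square H
    z = Q.T @ y
    n = H.shape[1]
    x = np.zeros(n)
    best = {"x": None, "d2": radius ** 2}

    def dfs(level, dist2):
        if level < 0:               # reached a leaf: full candidate vector
            if dist2 < best["d2"]:
                best["d2"], best["x"] = dist2, x.copy()
            return
        for s in symbols:           # branch over constellation points
            x[level] = s
            e = z[level] - R[level, level:] @ x[level:]
            d2 = dist2 + e ** 2     # partial Euclidean distance so far
            if d2 < best["d2"]:     # prune branches outside the sphere
                dfs(level - 1, d2)

    dfs(n - 1, 0.0)                 # search from the last layer downwards
    return best["x"], best["d2"]

# Demo: noiseless 2x2 system with BPSK symbols; recovers x_hat = [1, -1], d2 ≈ 0.
H = np.array([[2.0, 0.5], [0.3, 1.5]])
y = H @ np.array([1.0, -1.0])
x_hat, d2 = sphere_decode(H, y, (-1.0, 1.0), radius=10.0)
```

Because the best distance found so far replaces the radius, the search tightens automatically; the open problem the thesis targets is how to pick the initial radius well.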
Numerical and simulation results show that non-linear detection schemes yield better performance compared to linear detection schemes, however, at the expense of a slight increase in complexity. The first contribution in this thesis is the design of a near-ML-achieving SD algorithm for MIMO digital communication systems that reduces the number of search operations within the sphere-constrained search space at reduced detection complexity in Chapter 4. In this design, the distance between the ML estimate and the received signal is used to control the lower and upper bound radii of the proposed SD in order to avoid the NP-complete worst case. The detection method is based on the DFS algorithm and Successive Interference Cancellation (SIC). The SIC ensures that the effects of dominant signals are effectively removed. Simulation results presented in this thesis show that by employing pre-processing detection schemes, the complexity of the proposed SD can be significantly reduced, though at a marginal performance penalty. The second contribution is the determination of the initial sphere radius in Chapter 5. The new initial radius proposed in this thesis is based on the variable parameter α, which is commonly chosen based on experience to ensure that at least one lattice point exists inside the sphere with high probability. Using the variable parameter α, a new noise covariance matrix which incorporates the number of transmit antennas, the energy of the transmitted symbols and the channel matrix is defined. The new covariance matrix is then incorporated into the EMMSE model to generate an improved EMMSE estimate. The EMMSE radius is finally found by computing the distance between the sphere centre and the improved EMMSE estimate. This distance can be fine-tuned by varying the variable parameter α.
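The radius-selection idea can be sketched as follows. The thesis' exact covariance construction is not reproduced in this abstract, so the α-scaled regulariser below is an assumed stand-in built from the same ingredients (number of transmit antennas, symbol energy, channel matrix), not the actual formula:

```python
# Hedged sketch of an MMSE-style initial radius for the sphere decoder.
# The alpha-scaled regulariser is an assumption standing in for the thesis'
# new noise covariance matrix; only its ingredients match the description.
import numpy as np

def mmse_initial_radius(H, y, noise_var, symbol_energy, alpha=1.0):
    n_t = H.shape[1]                          # number of transmit antennas
    reg = alpha * noise_var * n_t / symbol_energy
    # Regularised (MMSE-style) estimate of the transmitted vector.
    x_hat = np.linalg.solve(H.T @ H + reg * np.eye(n_t), H.T @ y)
    # Radius = distance between the sphere centre and the soft estimate,
    # tunable via alpha.
    return float(np.linalg.norm(y - H @ x_hat))
```

Increasing α strengthens the regularisation, which pulls the soft estimate towards zero and enlarges the resulting radius, making it more likely that at least one lattice point lies inside the sphere.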
The beauty of the proposed method is that it reduces the complexity of the preprocessing step of the EMMSE to that of the Zero-Forcing (ZF) detector without significant performance degradation of the SD, particularly at low Signal-to-Noise Ratios (SNR). More specifically, it will be shown through simulation results that using the EMMSE preprocessing step will substantially improve performance whenever the complexity of the tree search is fixed or upper bounded. The final contribution is the design of the LRAD-MMSE-SIC based SD detection scheme, which introduces a trade-off between performance and increased computational complexity in Chapter 6. The Lenstra-Lenstra-Lovász (LLL) algorithm will be utilised to orthogonalise the channel matrix H into a new, near-orthogonal channel matrix H̄. The increased computational complexity introduced by the LLL algorithm will be significantly decreased by employing sorted QR decomposition of the transformed channel H̄ into a unitary matrix and an upper triangular matrix that retains the properties of the channel matrix. The SIC algorithm will ensure that the interference due to dominant signals is minimised, while the LDPC code will effectively stop the propagation of errors within the entire system. Through simulations, it will be demonstrated that the proposed detector still approaches ML performance while requiring much lower complexity compared to the conventional SD.
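The sorted QR step can be sketched as a modified Gram-Schmidt factorisation that picks the remaining column of smallest norm at each stage, a common heuristic for ordering SIC layers. This is illustrative only; the thesis applies it to the LLL-reduced channel, and the LLL step itself is not reproduced here:

```python
# Sketch of sorted QR decomposition (SQRD): modified Gram-Schmidt with
# column sorting. Returns Q, R and the column permutation `perm` such that
# H[:, perm] == Q @ R, with Q orthonormal and R upper triangular.
import numpy as np

def sorted_qr(H):
    m, n = H.shape
    Q = H.astype(float).copy()
    R = np.zeros((n, n))
    perm = list(range(n))
    for i in range(n):
        # Choose the remaining column of smallest norm next.
        k = i + int(np.argmin(np.linalg.norm(Q[:, i:], axis=0)))
        Q[:, [i, k]] = Q[:, [k, i]]
        R[:, [i, k]] = R[:, [k, i]]
        perm[i], perm[k] = perm[k], perm[i]
        # Normalise column i and orthogonalise the rest against it.
        R[i, i] = np.linalg.norm(Q[:, i])
        Q[:, i] = Q[:, i] / R[i, i]
        R[i, i + 1:] = Q[:, i] @ Q[:, i + 1:]
        Q[:, i + 1:] -= np.outer(Q[:, i], R[i, i + 1:])
    return Q, R, perm
```

The permutation fixes the layer order used by the SIC stage, so the most reliable layers can be detected first without any extra matrix inversions.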
64

Abundance, niche breadth and stress in the centre and at the border of the distribution range: A macroecological study on abundant and rare tree species

Köckemann, Benjamin 23 September 2008 (has links)
No description available.
65

Polynomial growth of concept lattices, canonical bases and generators: extremal set theory in Formal Concept Analysis

Junqueira Hadura Albano, Alexandre Luiz 24 July 2017 (has links) (PDF)
We prove that there exist three distinct, comprehensive classes of (formal) contexts with polynomially many concepts. Namely: contexts which are nowhere dense, of bounded breadth or highly convex. Already present in G. Birkhoff's classic monograph is the notion of breadth of a lattice; it equals the number of atoms of a largest Boolean suborder. Even though it is natural to define the breadth of a context as being that of its concept lattice, this idea had not been exploited before. We do this and establish many equivalences. Amongst them, it is shown that the breadth of a context equals the size of its largest minimal generator, its largest contranominal-scale subcontext, as well as the Vapnik-Chervonenkis dimension of both its system of extents and of intents. The polynomiality of the aforementioned classes is proven via upper bounds (also known as majorants) for the number of maximal bipartite cliques in bipartite graphs. These are results obtained by various authors over recent decades. The fact that they yield statements about formal contexts is a reward for investigating how two established fields interact, specifically Formal Concept Analysis (FCA) and graph theory. We improve considerably the breadth bound. Such improvement is twofold: besides giving a much tighter expression, we prove that it limits the number of minimal generators. This is strictly more general than upper bounding the quantity of concepts. Indeed, it automatically implies a bound on these, as well as on the number of proper premises. A corollary is that this improved result is a bound for the number of implications in the canonical basis too. With respect to the quantity of concepts, this sharper majorant is shown to be best possible. This fact is established by constructing contexts whose concept lattices exhibit exactly that many elements. These structures are termed, respectively, extremal contexts and extremal lattices.
The usual procedure of taking the standard context allows one to work interchangeably with either one of these two extremal structures. Extremal lattices are equivalently defined as finite lattices which have as many elements as possible, under the condition that they obey two upper limits: one for the number of join-irreducibles, the other for the breadth. Subsequently, these structures are characterized in two ways. Our first characterization is done using the lattice perspective. Initially, we construct extremal lattices by the iterated operation of finding smaller, extremal subsemilattices and duplicating their elements. Then, it is shown that every extremal lattice must be obtained through a recursive application of this construction principle. A byproduct of this contribution is that extremal lattices are always meet-distributive. Although this approach is revealing, relevant combinatorial questions in its vicinity remain unanswered. Most notably, the number of meet-irreducibles of extremal lattices escapes control when this construction is conducted. Aiming to get a grip on the number of meet-irreducibles, we succeed in proving an alternative characterization of these structures. This second approach is based on implication logic, and exposes an interesting link between the number of proper premises, pseudo-extents and concepts. A guiding idea in this scenario is to use implications to construct lattices. It turns out that constructing extremal structures with this method is simpler, in the sense that a recursive application of the construction principle is not needed. Moreover, we obtain with ease a general, explicit formula for the Whitney numbers of extremal lattices. This reveals that they are unimodal, too. Like the first, this second construction method is shown to be characteristic.
A particular case of the construction is able to force, with precision, a high number (in the sense of "exponentially many") of meet-irreducibles. Such occasional explosion of meet-irreducibles motivates a generalization of the notion of extremal lattices. This is done by means of considering a more refined partition of the class of all finite lattices. In this finer-grained setting, each extremal class consists of lattices with bounded breadth, number of join-irreducibles and meet-irreducibles as well. The generalized problem of finding the maximum number of concepts reveals itself to be challenging. Instead of attempting to classify these structures completely, we pose questions inspired by Turán's seminal result in extremal combinatorics. Most prominently: do extremal lattices (in this more general sense) have the maximum permitted breadth? We show a general statement in this setting: for every choice of limits (breadth, number of join-irreducibles and meet-irreducibles), we produce some extremal lattice with the maximum permitted breadth. The tools which underpin all the intuitions in this scenario are hypergraphs and exact set covers. In a rather unexpected but interesting turn of events, we obtain for free a simple and interesting theorem about the general existence of "rich" subcontexts. Precisely: every context contains an object/attribute pair which, once removed, results in a context with at least half the original number of concepts.
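As a tiny illustration of counting concepts, and of the contranominal-scale/breadth connection mentioned above, the sketch below enumerates the intents of a hypothetical toy context by brute force (the intents of a context are exactly the intersections of object intents, with the full attribute set arising from the empty intersection); this method is only sensible for very small contexts:

```python
# Count the formal concepts of a toy context by enumerating intents as
# intersections of object intents. The context is hypothetical.
from itertools import combinations

context = {            # objects -> attributes they have
    "g1": {"a", "b"},
    "g2": {"b", "c"},
    "g3": {"a", "c"},
}
attributes = {"a", "b", "c"}

# The empty subset's intersection is, by convention, the full attribute set.
intents = {frozenset(attributes)}
rows = list(context.values())
for k in range(1, len(rows) + 1):
    for combo in combinations(rows, k):
        intents.add(frozenset(set.intersection(*combo)))

# Each intent determines one concept: its extent is the set of objects
# possessing every attribute of the intent.
concepts = [({g for g, attrs in context.items() if intent <= attrs}, set(intent))
            for intent in intents]
print(len(concepts))   # -> 8
```

The toy context is a contranominal scale of dimension 3 (each object lacks exactly one attribute), so its concept lattice has breadth 3 and the maximal 2³ = 8 concepts, matching the breadth/contranominal-scale equivalence stated in the abstract.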
66

The role of floodplains in patterns of tree species diversity in Amazonia

Luize, Bruno Garcia January 2019 (has links)
Orientador (Advisor): Clarisse Palma da Silva / Abstract: Wetlands lie at the interface of terrestrial and aquatic environments, where water availability is seasonally in excess or in scarcity. The geological history of the Amazon basin is closely linked to wetlands of great spatial and temporal extent and of varied types; among these, the floodplains along the large rivers (i.e., várzea and igapó) are those with the greatest coverage. This thesis addresses the role of wetlands in tree species diversity in Amazonia. Wetland forests have lower tree species diversity than upland (terra-firme) forests, most likely due to ecological and physiological limitations related to soil water saturation and periodic flooding. Notwithstanding, 3,515 tree species have already been recorded in Amazonian wetland forests (Chapter 2), a number comparable to the tree species diversity of the Atlantic Forest. Wetland tree species show greater range sizes and niche breadths across the Neotropics than tree species that do not occur in wetlands (Chapter 3). Floristic compositional turnover and phylogenetic distances between floodplain forests in Central Amazonia vary widely among localities (Chapter 4). The most influential driver of floristic compositional turnover was the geographic distance between localities, whereas phylogenetic distances are driven mainly by the environmental gradients between forests. Furthermore, the most abundant species are, in general, those that show the greatest co-occurrence associations (Chapter 5). Co-occurrence structure is influenced by biotic interactions such as facilitation and competition among species, but also by niche similarities reflected in the evol... (Complete abstract: click electronic access below) / Doutor (Doctorate)
67

Relations between reading and vocabulary in 8- to 11-year-old students with developmental language disorder

Brossard, Stéphanie 04 1900 (has links)
No description available.
68

Polynomial growth of concept lattices, canonical bases and generators:: extremal set theory in Formal Concept Analysis

Junqueira Hadura Albano, Alexandre Luiz 30 June 2017 (has links)
69

Hybridization and whole genome duplication as drivers of biological invasions

Mattingly, Kali Z. January 2021 (has links)
No description available.
70

Spider and Beetle Communities across Urban Greenspaces in Cleveland, Ohio: Distributions, Patterns, and Processes

Delgado de la Flor, Yvan A. 11 September 2020 (has links)
No description available.
