1 |
TDRSS Link Budget Design Table. Minnix, Timothy; Horan, Stephen. 10 1900 (has links)
International Telemetering Conference Proceedings / October 25-28, 1993 / Riviera Hotel and Convention Center, Las Vegas, Nevada / The Consultative Committee for Space Data Systems (CCSDS) has issued Recommendation CCSDS 401.0-B for Radio Frequency and Modulation Systems to be used in Earth stations and spacecraft. Part of this Recommendation is a standardized design tool for link budget computations. This design tool is intended to assist spacecraft designers in preparing the power and performance designs of their spacecraft for communicating with existing standard ground stations. The present CCSDS Recommendation addresses a link design typical of the Deep Space Network (DSN). DSN link analyses use a large subset of link-specific parameters that are of no particular use if the space data link passes through the Tracking and Data Relay Satellite System (TDRSS). The link architecture also differs: the TDRSS parameter set must include an extra hop through the relay satellite (a two-hop link), whereas a DSN-type link is single-hop. In addition, the treatment of ranging, PN coding requirements, and TDRSS acquisition and data group formalities is either not of the same format or not present at all on DSN-type links. The baseline CCSDS 401 design tool is a Microsoft Excel spreadsheet that runs on an IBM PC or compatible computer. This baseline spreadsheet has been modified to account for the differences between the baseline CCSDS model and TDRSS link operations. The paper discusses the modifications made to the spreadsheet for the TDRSS system details, and presents example usages of the spreadsheet.
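The spreadsheet's core computation is a standard dB-domain link budget. A minimal Python sketch of a single-hop margin calculation follows; the function name and all numeric values are illustrative assumptions, not taken from the Recommendation or the modified spreadsheet:

```python
import math

BOLTZMANN_DBW = -228.6  # 10*log10(Boltzmann constant), in dBW/(K*Hz)

def link_margin_db(eirp_dbw, path_loss_db, rx_gain_dbi, system_noise_temp_k,
                   data_rate_bps, required_ebn0_db, other_losses_db=0.0):
    """Single-hop link margin in dB.

    C/N0  = EIRP + G_rx - L_path - L_other - 10*log10(T_sys) - 10*log10(k)
    Eb/N0 = C/N0 - 10*log10(R);  margin = Eb/N0 - required Eb/N0.
    """
    c_over_n0 = (eirp_dbw + rx_gain_dbi - path_loss_db - other_losses_db
                 - 10 * math.log10(system_noise_temp_k) - BOLTZMANN_DBW)
    eb_n0 = c_over_n0 - 10 * math.log10(data_rate_bps)
    return eb_n0 - required_ebn0_db

# Illustrative numbers only: 20 dBW EIRP, 200 dB path loss, 30 dBi receive
# gain, 300 K system noise temperature, 1 kbps, 10 dB required Eb/N0.
margin = link_margin_db(20, 200, 30, 300, 1000, 10)
```

For a TDRSS-type two-hop link, a calculation of this shape would be applied to each hop and the hops combined; the modified spreadsheet additionally carries the TDRSS-specific items (PN coding, acquisition, data group parameters) discussed in the paper.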
|
2 |
Web manifestations of knowledge-based innovation systems in the UK. Stuart, David. January 2008 (has links)
Innovation is widely recognised as essential to the modern economy. The term knowledge-based innovation system refers to innovation systems that recognise the importance of an economy's knowledge base and of efficient interactions between important actors from the different sectors of society. Such interactions are thought to enable greater innovation by the system as a whole. Whilst it may not be possible to fully understand all the complex relationships involved within knowledge-based innovation systems, bibliometric methodologies have emerged within the field of informetrics that allow us to analyse some of the relationships that contribute to the innovation process. However, due to the limitations of traditional bibliometric sources, it is important to investigate potential new sources of information. The web is one such source. This thesis documents an investigation into the potential of the web to provide information about knowledge-based innovation systems in the United Kingdom. Within this thesis, the link analysis methodologies that have previously been applied successfully to investigations of the academic community (Thelwall, 2004a) are applied to organisations from different sectors of society to determine whether link analysis of the web can provide a new source of information about knowledge-based innovation systems in the UK. This study makes the case that data may be collected ethically to provide information about the interconnections between web sites of different sizes and from different sectors of society, that there are significant differences in the linking practices of web sites within different sectors, and that reciprocal links provide a better indication of collaboration than uni-directional web links. Most importantly, the study shows that the web provides new information about the relationships between organisations, rather than just a repetition of the same information from an alternative source.
Whilst the study has shown that the web has much potential as a source of information on knowledge-based innovation systems, the same richness that makes it such a potentially useful source makes large-scale studies very labour-intensive.
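The distinction drawn between reciprocal and uni-directional links is easy to operationalise over a directed link set. A minimal sketch, with hypothetical site names (the thesis's actual data collection is far larger):

```python
def split_links(directed_links):
    """Split directed site-to-site links into reciprocal pairs and one-way links."""
    links = set(directed_links)
    reciprocal = {frozenset((a, b)) for (a, b) in links
                  if a != b and (b, a) in links}
    one_way = {(a, b) for (a, b) in links if (b, a) not in links}
    return reciprocal, one_way

# Hypothetical site-level links, as (source, target) pairs:
links = [("uni.example.ac.uk", "firm.example.co.uk"),
         ("firm.example.co.uk", "uni.example.ac.uk"),
         ("gov.example.uk", "uni.example.ac.uk")]
reciprocal, one_way = split_links(links)
```

Under the thesis's argument, the single reciprocal pair here would be read as evidence of collaboration, while the one-way government-to-university link would not.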
|
3 |
Métricas de análise de links e qualidade de conteúdo: um estudo de caso na Wikipédia / Link analysis metrics and content quality: a case study in Wikipedia. Hanada, Raíza Tamae Sarkis. 26 February 2013 (has links)
Many links between Web pages can be viewed as indicators of the quality and importance of the pages they point to. Accordingly, several studies have proposed metrics based on the link structure to infer web page content quality. However, as far as we know, the only work that has examined the correlation between such metrics and content quality was a limited study that left many open questions. Although these metrics have proven successful in the task of ranking pages returned as answers to queries submitted to search engines, it is not possible to determine the specific contribution of factors such as quality, popularity, and importance to the results. This difficulty is partly due to the fact that such information is hard to obtain for Web pages in general. Unlike ordinary Web pages, Wikipedia articles have their quality and importance assessed by human experts, while popularity can be estimated from article page views. This makes it feasible to verify the relation between these factors and link analysis metrics, our goal in this work. To accomplish that, we implemented several link analysis algorithms and compared the resulting rankings with those derived from the human evaluation made in Wikipedia with respect to quality, popularity, and importance. We found that link analysis metrics are more correlated with quality and popularity than with importance, and that the correlation is moderate.
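A sketch of the kind of comparison described: rank articles by a link analysis metric (PageRank, one of the classic choices) and correlate that ranking with an external score using Spearman's rank correlation. The toy graph, the made-up scores, and these stdlib-only implementations are illustrative assumptions, not the thesis's code:

```python
def pagerank(graph, d=0.85, iterations=50):
    """Basic PageRank over graph = {node: [outlinked nodes]}."""
    nodes = list(graph)
    n = len(nodes)
    pr = {v: 1.0 / n for v in nodes}
    for _ in range(iterations):
        new = {v: (1 - d) / n for v in nodes}
        for v, outs in graph.items():
            if outs:
                share = d * pr[v] / len(outs)
                for w in outs:
                    new[w] += share
            else:  # dangling node: redistribute its mass evenly
                for w in nodes:
                    new[w] += d * pr[v] / n
        pr = new
    return pr

def spearman(xs, ys):
    """Spearman rank correlation, assuming no ties."""
    def ranks(vals):
        order = sorted(range(len(vals)), key=lambda i: vals[i])
        r = [0] * len(vals)
        for rank, i in enumerate(order):
            r[i] = rank
        return r
    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))

# Toy article graph and made-up expert quality scores:
graph = {"a": ["b"], "b": ["a", "c"], "c": []}
scores = pagerank(graph)
rho = spearman([scores[v] for v in "abc"], [2.0, 3.0, 1.0])
```

A moderate correlation, as the thesis reports, would show up as rho well away from both 0 and 1.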
|
4 |
Missing Link Discovery in Wikipedia: A Comparative Study. Sunercan, Omer. 01 February 2010 (has links) (PDF)
The fast-growing online encyclopedia concept presents original and innovative features by taking advantage of information technologies. The links connecting the articles are one of the most important instances of these features. In this thesis, we present our work on discovering missing links in Wikipedia articles. This task is important for both readers and authors of Wikipedia. Readers will benefit from the increased article quality with better navigation support. On the other hand, the system can be employed to support authors during editing.
This study combines the strengths of different approaches previously applied to the task, and proposes its own techniques to reach satisfactory results. Because of the subjective nature of the task, automatic evaluation is hard to apply. Comparing approaches seems to be the best way to evaluate new techniques, and we offer a semi-automated method for evaluating the results. Recall is calculated automatically using existing links in Wikipedia; precision is calculated from manual evaluations by human assessors. Comparative results for different techniques are presented, showing the success of our improvements.
Our system employs the Turkish Wikipedia (Vikipedi) and, to the best of our knowledge, it is the first study on it. We aim to exploit the Turkish Wikipedia as a semantic resource to examine whether it is scalable enough for such purposes.
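The semi-automated evaluation splits naturally into the two measures named above. A minimal sketch, with hypothetical inputs (real link targets would come from article text):

```python
def recall_against_existing(suggested, held_out_links):
    """Share of known (held-out) Wikipedia links the system rediscovers."""
    hits = len(set(suggested) & set(held_out_links))
    return hits / len(held_out_links) if held_out_links else 0.0

def precision_from_judgments(assessor_votes):
    """Share of suggestions judged correct by human assessors."""
    return sum(assessor_votes) / len(assessor_votes) if assessor_votes else 0.0

# Hypothetical run: three suggested link targets for one article, two of
# which exist as held-out links; four suggestions judged by an assessor.
recall = recall_against_existing(["Ankara", "Bosphorus", "Ottoman Empire"],
                                 ["Ankara", "Istanbul"])
precision = precision_from_judgments([True, True, False, True])
```

The asymmetry is the point of the semi-automated design: recall needs no human effort because Wikipedia itself supplies the ground-truth links, while precision does, because a suggested link absent from the article may still be correct.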
|
5 |
Discovering Issue Networks Using Data Mining Techniques. Chuang, Tse-sheng. 01 August 2002 (has links)
With the development of data mining techniques, the knowledge discovered through data mining now ranges from business applications to fraud detection. Too often, however, we see only the profit-making justification for investing in data mining, while losing sight of the fact that it can help resolve issues of global or national importance. In this research, we propose an architecture for issue-oriented information construction and knowledge discovery related to political or public policy issues. In this architecture, we adopt issue networks as the description model and data mining as the core technique. The study is performed and verified by constructing a prototype system and analyzing case data.
There are three main topics in our research. The issue network information construction starts by retrieving text on a specified issue from news reports. Keywords retrieved from the news reports are converted into structured network nodes and presented in the form of issue networks. The second topic is the clustering of network actors: we adopt an issue-association clustering method that groups issue participants based on the issues that relate them. In the third topic, we use a link analysis method to compute the importance of actors and sub-issues.
Our study concludes with a performance evaluation by domain experts. We conduct recall and precision evaluation for the first topic, and certainty, novelty, and utility evaluation for the others.
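The importance computation on an issue network could, in its simplest form, be weighted degree centrality over actor co-occurrence edges. The abstract does not detail the specific link analysis method used, so the following is only a stand-in sketch with hypothetical actor names:

```python
from collections import defaultdict

def actor_importance(weighted_edges):
    """Weighted degree of each actor in an issue network.

    weighted_edges: (actor_a, actor_b, weight) triples, e.g. how often two
    actors co-occur in news reports on the same sub-issue.
    """
    score = defaultdict(float)
    for a, b, w in weighted_edges:
        score[a] += w
        score[b] += w
    return dict(score)

# Hypothetical actors extracted from news reports on one policy issue:
edges = [("ministry", "party_a", 2.0), ("ministry", "ngo", 1.0)]
ranking = actor_importance(edges)
```

More sophisticated link analysis (e.g. eigenvector-style scoring) would refine this by weighting a connection to an important actor more heavily than one to a peripheral actor.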
|
7 |
EASTERN RANGE TITAN IV/CENTAUR-TDRSS OPERATIONAL COMPATIBILITY TESTING. Bocchino, Chris; Hamilton, William. 10 1900 (has links)
International Telemetering Conference Proceedings / October 28-31, 1996 / Town and Country Hotel and Convention Center, San Diego, California / The future of range operations in the area of expendable launch vehicle (ELV) support is unquestionably headed in the direction of space-based rather than land- or air-based assets for such functions as metric tracking or telemetry data collection. To this end, an effort was recently completed by the Air Force’s Eastern Range (ER) to certify NASA’s Tracking and Data Relay Satellite System (TDRSS) as a viable and operational asset to be used for telemetry coverage during future Titan IV/Centaur launches. The test plan developed to demonstrate this capability consisted of three parts: 1) a bit error rate test; 2) a bit-by-bit compare of data recorded via conventional means vice the TDRSS network while the vehicle was radiating in a fixed position from the pad; and 3) an in-flight demonstration to ensure positive radio frequency (RF) link and usable data during critical periods of telemetry collection. The subsequent approval by the Air Force of this approach allows future launch vehicle contractors a relatively inexpensive and reliable means of telemetry data collection even when launch trajectories are out of sight of land-based assets or when land- or aircraft-based assets are not available for support.
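Parts 1 and 2 of the test plan both reduce to counting disagreements between two aligned bit streams. A minimal sketch with hypothetical data (real recordings would first need frame synchronization and alignment):

```python
def bit_error_rate(reference_bits, received_bits):
    """Bit-by-bit compare of two aligned recordings, returning the error rate."""
    if len(reference_bits) != len(received_bits):
        raise ValueError("recordings must be aligned to equal length")
    errors = sum(r != s for r, s in zip(reference_bits, received_bits))
    return errors / len(reference_bits)

# Hypothetical fragment: conventionally recorded data vs. the TDRSS-relayed copy.
ber = bit_error_rate([0, 1, 1, 0, 1, 0, 0, 1],
                     [0, 1, 0, 0, 1, 0, 0, 1])
```

A certification run would accumulate this count over long captures, since meaningful BERs on a healthy telemetry link are far smaller than anything visible in a few bits.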
|
8 |
ANTENNA PATTERN EVALUATION FOR LINK ANALYSIS. Pedroza, Moises. 10 1900 (has links)
International Telemetering Conference Proceedings / October 28-31, 1996 / Town and Country Hotel and Convention Center, San Diego, California / The use of high bit rates in the missile testing environment requires that the receiving telemetry system(s) have the correct signal margin for no PCM bit errors. This requirement, plus the fact that the use of “redundant systems” is no longer considered an optimum support scenario, has made it necessary to select the minimum number of tracking sites that will gather the data with the required signal margin. A very basic link analysis can be made by using the maximum and minimum gain values from the transmitting antenna pattern. Another way of evaluating the transmitting antenna gain is to base the gain on the highest-percentile appearance of the highest gain value. This paper discusses the mathematical analysis the WSMR Telemetry Branch uses to determine the signal margin resulting from a radiating source along a nominal trajectory. The mathematical analysis calculates the missile aspect angles (Theta, Phi, and Alpha) to the telemetry tracking system, which yield the transmitting antenna gain. The gain is obtained from the Antenna Radiation Distribution Table (ARDT) that is stored in a computer file. An entire trajectory can be evaluated for signal margin before an actual flight. The expected signal strength level can be compared to the actual signal strength level from the flight. This information can be used to evaluate any plume effects.
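The per-point computation described (aspect angles into an ARDT gain lookup, then a margin against a required level) can be sketched as follows. The nearest-neighbour lookup, the free-space-only path loss, and every numeric value are illustrative assumptions, not WSMR's actual procedure or table format:

```python
import math

def signal_margin_db(tx_power_dbm, ardt, theta_deg, phi_deg,
                     slant_range_km, freq_mhz, rx_gain_dbi, required_dbm):
    """Expected signal margin at one trajectory point.

    ardt maps (theta, phi) in degrees to transmitting antenna gain in dBi;
    a nearest-neighbour lookup stands in for the real, finer-grained table.
    Path loss is free-space only: 32.45 + 20*log10(f_MHz) + 20*log10(d_km).
    """
    key = min(ardt, key=lambda k: (k[0] - theta_deg) ** 2 + (k[1] - phi_deg) ** 2)
    tx_gain_dbi = ardt[key]
    fspl_db = 32.45 + 20 * math.log10(freq_mhz) + 20 * math.log10(slant_range_km)
    received_dbm = tx_power_dbm + tx_gain_dbi - fspl_db + rx_gain_dbi
    return received_dbm - required_dbm

# Illustrative two-entry "ARDT" and one trajectory point at S-band:
ardt = {(0.0, 0.0): 2.0, (90.0, 0.0): -5.0}
margin = signal_margin_db(30, ardt, 10.0, 0.0, 100.0, 2250.0, 40.0, -90.0)
```

Evaluating a whole nominal trajectory is then just this calculation repeated over the time-tagged sequence of aspect angles and slant ranges before the flight.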
|
9 |
Segmentação dos usuários de cartão de crédito por meio da análise de cesto de compras / Segmentation of credit card clients by market basket analysis. Tavares, Pedro Daniel. 17 January 2012 (has links)
The objective of this research is to elaborate a segmentation model based on clients' confirmed consumption behaviour, using Link Analysis and Market Basket Analysis techniques applied to the clients' credit card statements. The proposed model was used to test the predictability of clients' next transactions through a validation sample. The motivation for this research rests on three pillars: scientific, technological, and marketing. In the scientific context, although studies have been published associating credit card use with customer segmentation profiles, no published study uses data from the card transactions themselves as a source of client information; the most likely reason is the difficulty of collecting the data needed for this type of research. This work became feasible with the support of a large financial institution, under the condition that only anonymised client data were analysed and that no strategic information of the institution was revealed. In the technological context, with information technology in constant development, credit card operations are processed online in real time, with information exchanged between the merchant and the card issuer at the moment a charge is submitted and accepted by the final consumer; this makes promotional actions possible along the entire credit card value chain, generating more value for clients and companies. In the marketing context, Brazil has shown high growth rates in the credit card market in recent decades, replacing older means of payment and instalment credit. In Brazil in particular, purchases are commonly paid by credit card in instalments, with and without interest, which contributes to the substitution of other forms of credit. As a benefit of this work, it was concluded that, from knowledge of a client's consumption, market basket analysis can be applied to predict the client's next transactions, in order to segment clients and stimulate them to accept a particular offer.
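Market basket analysis over card statements amounts to mining association rules under support and confidence thresholds. A simplified pair-rule sketch with hypothetical spending categories (real mining would use an algorithm such as Apriori over larger itemsets):

```python
from itertools import combinations
from collections import Counter

def pair_rules(transactions, min_support=0.2, min_confidence=0.5):
    """Mine X -> Y rules over item pairs, with support and confidence only."""
    n = len(transactions)
    item_count = Counter()
    pair_count = Counter()
    for t in transactions:
        items = set(t)
        item_count.update(items)
        pair_count.update(combinations(sorted(items), 2))
    rules = []
    for (a, b), c in pair_count.items():
        support = c / n  # share of baskets containing both items
        if support < min_support:
            continue
        for x, y in ((a, b), (b, a)):
            confidence = c / item_count[x]  # P(y in basket | x in basket)
            if confidence >= min_confidence:
                rules.append((x, y, support, confidence))
    return rules

# Hypothetical monthly statements reduced to spending categories:
transactions = [["fuel", "grocery"], ["fuel", "grocery"],
                ["fuel", "travel"], ["grocery"]]
rules = pair_rules(transactions)
```

A rule such as "travel implies fuel" with high confidence is the kind of pattern the segmentation model would use to predict a client's next transactions and target an offer.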
|
10 |
High-Speed Link Modeling: Analog/Digital Equalization and Modulation Techniques. Lee, Keytaek. 2012 May 1900 (has links)
High-speed serial input/output (I/O) links have required advanced equalization and modulation techniques to mitigate the inter-symbol interference (ISI) caused by multi-Gb/s signaling over band-limited channels. Increasing demands on transceiver power and area have driven ongoing interest in analog-to-digital converter (ADC) based links, which allow robust equalization and flexible adaptation to advanced signaling. With diverse options for ISI control, link performance analysis for complex transceiver architectures is very important. This work presents advanced statistical modeling for ADC-based links, a performance comparison of existing modulation and equalization techniques, and a proposed hybrid ADC-based receiver that achieves further power savings in digital equalization.
Statistical analysis precisely estimates high-speed link margins under given implementation constraints and at low target bit-error rates (BER), typically in the range 1e-12 to 1e-15, by applying proper statistical bounds on noise and distortion. The proposed statistical ADC-based link model uses the bounded probability density function (PDF) of the limited quantization distortion (4-6 bits) through digital feed-forward and decision-feedback equalizers (FFE-DFE) to improve low-target-BER estimation. Based on this statistical modeling, the work surveys the impact of insufficient equalization, jitter, and crosstalk on modulation selection among two- and four-level pulse amplitude modulation (PAM-2 and PAM-4, respectively) and duobinary, and the ADC resolution reduction achieved by a partial analog equalizer (PAE).
While the channel loss at the effective Nyquist frequency and the signaling constellation loss initially guide modulation selection, the statistical analysis results show that PAM-4 best tolerates jitter and crosstalk, and that duobinary requires the least equalization complexity. Meanwhile, despite robust digital equalization, high-speed ADC complexity and power consumption are still a critical bottleneck, so a PAE is needed to reduce the ADC resolution requirement. Statistical analysis shows that up to 8-bit resolution is required for 12.5 Gb/s data communication over a channel with 46 dB of loss without a PAE, while a 5-bit ADC is enough with a 3-tap FFE PAE. For optimal ADC resolution reduction by the PAE, the digital equalizer complexity also increases to provide enough margin to tolerate the significant quantization distortion. The proposed hybrid receiver defines unreliable-signal thresholds by statistical analysis and selectively applies additional digital equalization, saving dynamic power that would otherwise grow in the digital back end. Simulation results report that the hybrid receiver saves at least 64% of the digital equalization power with a 3-tap FFE PAE at a 12.5 Gb/s data rate over channels with up to 46 dB of loss. Finally, this work shows that the use of an embedded-DFE ADC in the hybrid receiver is limited by error propagation.
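At target BERs of 1e-12 and below, error rates are computed from noise statistics rather than simulated directly, which is the point of the statistical modeling. A textbook nearest-neighbour estimate for M-PAM in additive Gaussian noise, used here only as a stand-in for the dissertation's full statistical model (which additionally bounds quantization distortion, jitter, and crosstalk):

```python
import math

def q_func(x):
    """Gaussian tail probability Q(x)."""
    return 0.5 * math.erfc(x / math.sqrt(2))

def pam_symbol_error_rate(m, snr_db):
    """Nearest-neighbour SER estimate for M-PAM in additive Gaussian noise.

    SNR is taken as average signal power over noise variance (real baseband):
    SER ~= 2*(1 - 1/M) * Q(sqrt(3*SNR/(M^2 - 1))).
    """
    snr = 10 ** (snr_db / 10)
    return 2 * (1 - 1 / m) * q_func(math.sqrt(3.0 * snr / (m * m - 1)))

# PAM-4 halves the Nyquist frequency (two bits per symbol) but pays a
# constellation penalty at equal SNR -- the trade-off the modulation
# survey weighs against channel loss, jitter, and crosstalk.
ser_pam2 = pam_symbol_error_rate(2, 10.0)
ser_pam4 = pam_symbol_error_rate(4, 10.0)
```

Extending such closed-form tails with the bounded quantization-error PDF through the FFE-DFE is what lets the proposed model estimate margins at 1e-12 to 1e-15 without impractically long bit-true simulation.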
|