  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

Den nya IT-bubblan : En studie om journaliststudenter och deras sökvanor på nätet

Dahlgren, Gustav, Dahlqvist, Olle January 2012 (has links)
The web is increasingly characterized by personalization. Big social networks like Facebook, as well as the leading search engine Google, increasingly use personalization algorithms to tailor the information they present to users, all in order to make the information more relevant and engaging for the end consumer. How does this personalization affect journalists, who increasingly search the web as part of their journalistic research? In this essay we look at the effects that personalization has on the journalists of tomorrow by conducting a survey among students of journalism. We have also done a study of literature and theories to try to determine what consequences personalization will have on the internet in the future. We make an in-depth study of the search engine Google, as it is one of the main sources of information for journalists, and we tie this to theories of filter bubbles and gatekeeping. We find that the question needs further study to really determine the threat we face, but conclude that personalization should be made far more evident to users when it occurs.
2

Online Information Search, Market Fundamentals and Apartment Real Estate

Das, Prashant 20 December 2013 (has links)
Using a system of multi-step equations, I examine the association between online rental searches and fundamental apartment real estate market variables, namely vacancy rates, rental rates and real estate asset price returns. I find that consumer real estate searches are significantly associated with the market fundamentals after controlling for known determinants of these variables. In particular, I show that apartment-rental-related online searches are endogenously and contemporaneously associated with reduced vacancy rates. However, no significant association between the searches and rental rates is detected. The searches are contemporaneously associated with positive returns on the appraised values of multifamily assets. There is some evidence that the searches are fundamentally associated with REIT returns in the short run and that REIT investors watch online search trends to inform their stock pricing decisions.
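The association the abstract describes can be illustrated with a single-equation sketch: an OLS regression of vacancy rates on a search-volume index plus a control. The data below are simulated and every variable name is hypothetical; this is not the thesis's specification, only a minimal illustration of how a negative search-vacancy association would surface in a regression coefficient.

```python
import numpy as np

def ols_beta(y, X):
    """OLS coefficients for y = X @ beta + error (X includes a constant)."""
    return np.linalg.lstsq(X, y, rcond=None)[0]

rng = np.random.default_rng(0)
n = 120  # hypothetical monthly observations
search_index = rng.normal(50, 10, n)   # online rental-search volume (invented)
control = rng.normal(0, 1, n)          # a stand-in market control (invented)
# Simulated relationship: higher search activity, lower vacancy rate.
vacancy = 8.0 - 0.05 * search_index + 0.3 * control + rng.normal(0, 0.5, n)

X = np.column_stack([np.ones(n), search_index, control])
beta = ols_beta(vacancy, X)
print(beta[1])  # negative coefficient on search volume, near the simulated -0.05
```

The sign and magnitude of `beta[1]` recover the simulated relationship; the thesis's actual multi-step system additionally handles endogeneity, which this sketch does not.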
3

„Google“ paieškos „AdWords“ skelbimų efektyvumo vertinimas / Assessing the effectiveness of Google search engine AdWords ads

Lukošiūtė, Edita 05 June 2014 (has links)
The paper analyzes the effectiveness of Google search engine AdWords ads. Due to the growing importance of information technology, most users search for information about products and services and buy them online; creating effective search-engine ads has therefore become extremely important to advertisers. The objective of this paper is to present a general model for measuring the effectiveness of Google AdWords ads and for improving it, based on the scientific literature on advertising-effectiveness assessment and the results of a case analysis. The master's thesis consists of three main parts. The first, theoretical part analyzes and compares different authors' approaches to assessing advertising effectiveness, identifying the methods and indicators used, including the particulars of evaluating Google AdWords ads. The analytical part is a case study that evaluates the effectiveness of selected Google AdWords campaigns. The third part presents a model for measuring the effectiveness of Google AdWords ads, together with opportunities and measures for improving it.
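Effectiveness indicators of the kind such a model typically builds on (click-through rate, cost per click, conversion rate, cost per acquisition) can be computed directly from campaign counts. A minimal sketch with made-up numbers; the thesis's own indicator set and formulas are not reproduced here.

```python
def campaign_metrics(impressions, clicks, cost, conversions):
    """Standard search-ad effectiveness indicators from raw campaign counts."""
    return {
        "ctr": clicks / impressions,        # click-through rate
        "cpc": cost / clicks,               # average cost per click
        "conv_rate": conversions / clicks,  # conversion rate
        "cpa": cost / conversions,          # cost per acquisition
    }

# Hypothetical campaign totals, chosen only for illustration.
m = campaign_metrics(impressions=10_000, clicks=250, cost=125.0, conversions=10)
print(m)  # ctr 2.5%, cpc 0.50, conv_rate 4%, cpa 12.50
```

Comparing these ratios across campaigns (rather than raw click counts) is what makes effectiveness assessments commensurable when budgets differ.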
4

Search Engine Optimization and the connection with Knowledge Graphs

Marshall, Oliver January 2021 (has links)
Aim: The aim of this study is to analyze the usage of Search Engine Optimization and Knowledge Graphs, and the connection between them, to achieve profitable business visibility and reach. Methods: Following a qualitative method with an inductive approach, ten marketing professionals were interviewed via an online questionnaire. Both primary and secondary data were utilized, and scientific theory and empirical findings were linked and discussed in the analysis chapter. Findings: This study establishes how businesses currently utilize Search Engine Optimization, in terms of common techniques and methods. We demonstrate their effectiveness on the Google Knowledge Graph and Google My Business, and the resulting positive business impact of increased visibility and reach. Difficulties remain in accurate tracking procedures for analyzing quantifiable results. Contribution of the thesis: This study contributes to the literature on both Search Engine Optimization and Knowledge Graphs by providing a new perspective on how these subjects are utilized in modern marketing. In addition, it provides an understanding of the benefits of SEO utilization on Knowledge Graphs. Suggestions for further research: We suggest more extensive investigation of the elements and utilization of Knowledge Graphs: how their structure can be affected, which techniques are most effective at larger scale, and how effectively the benefits can be measured. Key Words: Search Engine, Search Engine Optimization, SEO, Knowledge Graphs, Google My Business, Google Search Engine, Online Marketing.
5

Building backlinks with Web 2.0 : Designing, implementing and evaluating a costless off-site SEO strategy with backlinks originating from Web 2.0 blogs. / Bygga länkar med Web 2.0 : Designar, implementerar, och evaluerar en kostnadsfri off-site SEO strategi med länkar från Web 2.0 bloggar.

Khalil, Jacob, Edlund, Gustaf January 2020 (has links)
Purpose – The purpose of this thesis is to contribute to research on the efficacy of backlinks originating from Web 2.0 blogs by designing a costless method for creating controllable backlinks to a website, solely with Web 2.0 blogs as the source of backlinks. The objective is to find out whether such links can affect a website's positions in the Google SERPs in 2020, and to contribute a controllable link strategy available to any SEO practitioner regardless of economic circumstances. The thesis answers two research questions: 1. What positions in the SERPs can an already existing website claim as a result of creating and implementing a link strategy that utilizes Web 2.0 blogs? 2. In the context of implementing a link strategy, what practices must be considered for it to remain unpunished by Google in 2020? Method – The research method chosen, given the nature of the project, is Design Science Research (DSR), in which the designed artefact is observationally evaluated through a field study. The artefact consists of four unique Web 2.0 blogs that each sent a backlink to the target website through qualitative blog posts following Google's guidelines. Quantitative data was collected using SERPWatcher by Mangools, which tracked 29 keywords for 52 days, and was then qualitatively analysed. Conclusions – There is a distinct relation between the improvement in keyword positions and the implementation of the artefact, leading us to conclude that it is reasonable to believe that Web 2.0 blog backlinks can affect a website's positions in the SERPs in modern Google Search. More research experimenting with Web 2.0 blogs as the origin of backlinks must be conducted to truly affirm or deny this claim, as an evaluation on a single website is insufficient. It can be concluded that the target website was not punished by Google after implementation.
While Google's search algorithm may be complex and intelligent, it was not intelligent enough to punish our attempt to manipulate another website's keyword positions via a link scheme. Passing as legitimate may have been due to following E-A-T practices and acting naturally, but this is mere speculation without comparisons with similar strategies that disregard these practices. Limitations – Rigorous testing and evaluation of the designed artefact and its components is very important in research that employs DSR. Due to time constraints, the lack of data points in the form of websites the artefact has been tested on, as well as the absence of iterative design, partially undermines the validity of the artefact, since it does not meet the criterion of being rigorously tested. The data collected would have been more impactful if keyword data had been gathered many days before executing the artefact, as a pre-implementation period longer than 7 days would act as a reference point when evaluating the effect. It would also be ideal to track the effects post-implementation over a longer period, given the slow nature of SEO. Keywords – SEO, search engine optimization, off-page optimization, Google Search, Web 2.0, backlinks.
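The evaluation step — comparing tracked keyword positions before and after the artefact's implementation — can be sketched as a simple before/after mean. The tracking data below is invented for illustration, not the thesis's SERPWatcher export, and the implementation day is a hypothetical parameter.

```python
from statistics import mean

def average_positions(daily_positions, implementation_day):
    """Compare mean SERP position (lower is better) before and
    after a link-building intervention."""
    before = [p for day, p in daily_positions if day < implementation_day]
    after = [p for day, p in daily_positions if day >= implementation_day]
    return mean(before), mean(after)

# Hypothetical tracking data: (day, position) samples for one keyword.
tracking = [(1, 42), (3, 41), (5, 43), (10, 35), (20, 28), (40, 22)]
before, after = average_positions(tracking, implementation_day=8)
print(before, after)  # mean position improved (dropped) after day 8
```

As the Limitations paragraph notes, a pre-implementation window of only a few observations makes `before` a weak baseline; a longer window would make such a comparison far more trustworthy.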
6

A transparência do buscador Google como direito dos consumidores na tutela do meio ambiente digital

Souza, Carolina Ferreira 14 March 2018 (has links)
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior - CAPES / Fundação São Paulo - FUNDASP / The present work deals with the lack of transparency of the Google search engine, analyzing which rights of the consumers who use this service are affected, as well as the damage caused to the balance of the digital environment. The capture of data and information circulating through the most varied pages, programs and applications belonging to the large corporations of the network is increasingly refined and less transparent. In addition, access to existing data and information is increasingly filtered and directed toward partners and advertisers, in more or less explicit ways, depending on the service provided or the page accessed.
The treatment reserved for the digital environment must therefore begin with the immediate and full universalization of Internet access and its data in favor of all, without distinction, privilege, exclusion or exception, as well as with transparency about the algorithms that operate in the network through machine learning and other forms of artificial intelligence; failing that, those harmed by state inaction and/or by capitalist speculation must be entitled to take the applicable administrative and judicial measures to realize their rights. The methodology was based on a bibliographic review, collecting textual documents such as legislation, jurisprudence, relevant doctrine and technical publications, as well as an analysis of some of Google's terms of use and data policies, in order to verify the lack of transparency and the disrespect for human rights and Brazilian consumer rights, and to help in understanding the problem of the commercialization and bias of digital media. The results indicate that existing legislation is sufficient to protect consumers who use the Google search service in several respects; the work nevertheless considers new data-protection legislation as one more legal strategy for protecting the human right to communicate.
7

Missbruk av dominerande ställning på marknaden för digitala sökmotorer – en fallstudie av Google Search : En komparativ studie av amerikansk antitrustlagstiftning och europeisk konkurrensrätt / Abuse of Dominance on the Market for Digital Search Engines – A Case Study of Google Search : A Comparative Legal Study of American Antitrust Law and European Competition Law

Blake Elmvall, Alma January 2016 (has links)
The competition-law regimes of the United States and the EU are the world's leading ones today. The prohibition of abuse of a dominant position in Article 102 TFEU and of monopolization in Section 2 of the Sherman Act forms a central part of the competition rules of the EU and the US. The rules share many similarities, but they differ to some extent in purpose, wording, and in how they are applied by competition authorities and courts. Increasing digitalization, combined with technological development, has created a new kind of market, so-called new economy industries, which differ from traditional markets. These new digital markets lack geographic boundaries and are characterized by innovation as the competitive driving force. Google is the most used search engine in the EU and the US. The company has been under investigation by the competition authorities of both jurisdictions since 2010. By favoring its own services in the search results, Google is accused of preventing competing actors from entering the market and thereby restricting competition. This is the first time Article 102 TFEU and Section 2 of the Sherman Act have been applied to a search engine. The traditional tools that competition authorities use in investigations under Article 102 TFEU and Section 2 of the Sherman Act cannot, however, be applied satisfactorily to digital markets. The competition rules of the EU and the US have not been adapted to the particular conditions of digital markets, which creates uncertainty about the effectiveness of competition law. The differences between the two jurisdictions' application of Article 102 TFEU and Section 2 of the Sherman Act to Google's conduct thus stem primarily from the particular conditions of digital markets and the uncertain legal position.
A reform of the competition rules in the EU and the US is therefore necessary to ensure that digital markets can be regulated from a competition-law perspective in the future.
8

Arrangement of Google Search Results and Imperial Ideology: Searching for Benghazi, Libya

Stewart, Jacob 01 January 2014 (has links)
This project responds to an ongoing discussion in scholarship that identifies and analyzes the ideological functions of computer interfaces. In 1994, Cynthia Selfe and Richard Selfe claimed that interfaces are maps of cultural information and are therefore ideological (485). For Selfe and Selfe and other scholars, these interfaces carried a colonial ideology that resulted in Western dominance over other cultures. Since this early scholarship, our perspectives on interface have shifted with changing technology; interfaces can no longer be treated as having persistent and predictable characteristics like texts. I argue that interfaces are interactions among dynamic information that is constantly being updated online. One of the most prominent ways users interact with information online is through the use of search engines such as Google. Interfaces like Google assist users in navigating dynamic cultural information. How this information is arranged in a Google search event has a profound impact on what meaning we make surrounding the search term. In this project, I argue that colonial ideologies are upheld in several Google search events for the term "Benghazi, Libya." I claim that networked connection during Google search events leads to the creation and sustainment of a colonial ideology through patterns of arrangement. Finally, I offer a methodology for understanding how ideologies are created when search events occur. This methodology searches for patterns in connected information in order to understand how they create an ideological lens.
9

Multiple Sides, Multiple Challenges : The Need for a Uniform Approach in Defining the Relevant Product Market in Abuse of Dominance Cases on Multi-Sided Markets

Giesecke, Jacob January 2018 (has links)
The study shows that multi-sided markets pose difficulties when the relevant product market is to be defined. These difficulties pertain to two questions. The first is whether one or several markets should be defined. In this regard, it is not easy to extract a coherent method from the cases examined. Instead, the methods applied give the impression of ad hoc solutions, where similar circumstances result in dissimilar outcomes. Indeed, it is hard to reconcile the different market definitions in Visa International MIF and MasterCard MIF. The uncertainties are not limited to these two cases, as the methods applied in Google Shopping also give rise to ambiguities. Why was the market for general search engine platforms separated into two distinct product markets, while the market for comparison shopping services encompassed both sides? Unfortunately, this discussion was not present in the decision, which only reinforces the impression that these questions are solved on an ad hoc basis. A clear method of approaching multi-sided markets is desirable, not least because the enforcement of competition rules must be characterized by consistency and foreseeability; hopefully, the judgement in Google Shopping will bring further clarity. Nonetheless, the conclusion is that one market should be defined when differences between competitive constraints on the two sides are absent; e contrario, two markets should be defined when such differences are present. This is true regardless of whether the market in question is a transaction or a non-transaction market. This method seems preferable to strictly adhering to the division of multi-sided markets into transaction and non-transaction markets.
If the Commission's analysis is correct that there are no differences in competitive constraints on the two sides of comparison shopping services, in combination with crossing network effects, the platform's multi-sidedness is a necessary trait for both sides. This means that a substitute has to be multi-sided in order to be included in the relevant product market, which minimizes the risk of false negatives. Vice versa, the definition of two markets allows one-sided products to be included in the relevant product market, which minimizes the risk of false positives. This is important not only for the binary finding of dominance or non-dominance, but also for the degree of dominance. As concluded above, incorrectly defining one market may artificially inflate the degree of dominance into false super-dominance, and incorrectly defining several markets may artificially dilute it. The second question is how substitutability should be measured. It is obvious from the cases examined that qualitative rather than quantitative measures are used: the products' characteristics, intended use, purpose, functionalities, users' perceptions of the product, and so on were given much attention, and the SSNIP test was not applied in any of the cases. The first conclusion to be drawn is therefore that the difficulties of measuring substitutability on multi-sided markets mainly concern quantitative measures. The arguments against applying a SSNIP test related to the cellophane fallacy (in two forms, one of which was deceptively similar to the reverse cellophane fallacy) and to differences in price sensitivity between the two sides. Network effects present an additional difficulty, which may lead to exaggerated results when measuring substitutability. The second conclusion is that there is a reluctance to apply a SSNIP test in a way that is tailored to multi-sided markets.
One method that has been proposed is to apply the test to the total sum paid by both sides, while allowing the intermediary to adjust the price increase in accordance with its price structure. The categorical dismissal of applying the test in this way suggests that adapted versions have some time to wait before being introduced into case law and decisional practice. If they are, as their proponents argue, a robust way of broadening the evidence of possible substitutability, this is unfortunate. The risk of defining the market too narrowly or too broadly is of course present here as well; a broader spectrum of evidence therefore minimizes the risk of incorrectly finding both dominance and non-dominance.
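The SSNIP test discussed above asks whether a hypothetical monopolist could profitably impose a small but significant non-transitory increase in price (typically 5-10%); if too many customers would switch away, the candidate market is too narrow and must be widened. A stylized single-sided sketch, assuming a linear demand response; the prices, margins and elasticity figures are invented for illustration and the multi-sided adaptations the study discusses are not modeled here.

```python
def ssnip_profitable(price, cost, quantity, increase, elasticity):
    """Hypothetical-monopolist (SSNIP) check: is a small but significant
    non-transitory price increase profitable, given demand elasticity?"""
    new_price = price * (1 + increase)
    # Linear approximation: % quantity lost = elasticity * % price rise.
    new_quantity = quantity * (1 - elasticity * increase)
    baseline_profit = (price - cost) * quantity
    candidate_profit = (new_price - cost) * new_quantity
    return candidate_profit > baseline_profit

# Inelastic demand: a 5% rise is profitable -> the candidate market stands.
print(ssnip_profitable(100, 60, 1000, 0.05, elasticity=1.0))   # True
# Elastic demand: too many customers switch -> widen the candidate market.
print(ssnip_profitable(100, 60, 1000, 0.05, elasticity=5.0))   # False
```

The proposed multi-sided variant mentioned above would apply the same profitability logic to the total sum paid by both sides, letting the platform rebalance its price structure, which this one-sided sketch deliberately omits.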
10

Noções de grafos dirigidos, cadeias de Markov e as buscas do Google

Oliveira, José Carlos Francisco de 30 August 2014 (has links)
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior - CAPES / This paper's main purpose is to highlight some of the mathematical concepts behind the ranking produced by a search on the most used search site in the world: Google. We begin by briefly reviewing some high-school topics: matrices, linear systems and probability. We then present basic notions of directed graphs and discrete-time Markov chains, with emphasis on the steady-state vector, since it guarantees long-term prediction results. These concepts are central to the paper, because they are used to explain the mathematics behind the Google search engine. We then detail how Google ranks the pages of a search, i.e., how the results of a query are classified, determining which results are presented sequentially in order of relevance. Finally we obtain PageRank, the algorithm that builds what is called the Google matrix and ranks the pages of a search. We finish with a brief history of web search engines, from their founders to the rise and hegemony of Google.
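The construction this abstract describes — a link matrix, a Google matrix, and its steady-state vector obtained by power iteration — can be sketched on a toy four-page web. The adjacency list and the damping factor 0.85 are illustrative, standard choices, not taken from the thesis.

```python
import numpy as np

def pagerank(adjacency, damping=0.85, tol=1e-10):
    """Rank pages by the steady-state vector of the Google matrix."""
    n = len(adjacency)
    # Column-stochastic link matrix: column j spreads page j's vote
    # evenly over its outgoing links (dangling pages link everywhere).
    M = np.zeros((n, n))
    for j, links in enumerate(adjacency):
        if links:
            for i in links:
                M[i, j] = 1.0 / len(links)
        else:
            M[:, j] = 1.0 / n
    # Google matrix: damped link matrix plus uniform teleportation term.
    G = damping * M + (1 - damping) / n * np.ones((n, n))
    # Power iteration converges to the steady-state (PageRank) vector,
    # since G is a positive column-stochastic matrix.
    r = np.full(n, 1.0 / n)
    while True:
        r_next = G @ r
        if np.abs(r_next - r).sum() < tol:
            return r_next
        r = r_next

# Tiny 4-page web: 0 -> 1,2 ; 1 -> 2 ; 2 -> 0 ; 3 -> 0,2
ranks = pagerank([[1, 2], [2], [0], [0, 2]])
print(ranks)  # entries sum to 1; page 2 (the most linked-to) ranks highest
```

The steady-state vector is exactly the long-term prediction the abstract attributes to Markov chains: the probability of finding a random surfer on each page after many steps, independent of where the surfer started.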
