Local SEO Australia


Image optimization strategies: Image optimization strategies outline best practices for reducing file sizes, enhancing quality, and improving metadata. Following these strategies results in faster load times, improved user experience, and better search rankings.

Image optimization testing tools: Image optimization testing tools measure file sizes, load times, and display quality across devices. Using these tools helps identify areas for improvement, ensuring that your images perform well and enhance overall site performance.

Image optimization tools: Using image optimization tools, such as compression software and format converters, streamlines the process of reducing file sizes and improving quality. These tools help ensure that images load quickly and look great across all devices.


Image optimization tutorials: Tutorials provide step-by-step guidance for compressing, resizing, and enhancing images. Following these tutorials ensures that your images are fully optimized, resulting in faster load times and better search rankings.

Image optimization workflow: Establishing a clear image optimization workflow streamlines the process of compressing, resizing, and adding metadata. A well-defined workflow helps maintain consistency, improves efficiency, and ensures better image performance.

Image optimization workflow automation: Workflow automation streamlines image compression, resizing, and metadata updates. By automating these tasks, you save time, maintain consistent quality, and ensure that your images remain optimized at all times.

Citations and Other Useful Links


Image performance benchmarks: Establishing performance benchmarks for images helps measure how well they load and render on various devices. Benchmarks provide a reference point for identifying issues, refining optimization efforts, and improving overall site performance.

Image performance monitoring: Image performance monitoring tracks how well images load and render on different devices. By analyzing performance data, you can identify bottlenecks, improve load speeds, and ensure a smooth user experience.

Image quality settings: Adjusting image quality settings allows you to balance clarity and file size. By optimizing these settings, you maintain a visually appealing site while improving load times and enhancing overall performance.


Image scaling: Image scaling involves adjusting the size of images to match their intended display dimensions. By scaling images correctly, you prevent oversized files from slowing down your website and ensure a smooth user experience.

Image scaling for retina displays: Scaling images for retina displays ensures that they appear sharp and clear on high-resolution screens. By preparing images specifically for retina-quality displays, you improve visual quality and user satisfaction on modern devices.

Image SEO: Image SEO involves optimizing image filenames, alt text, captions, and metadata to improve search engine rankings. Effective image SEO increases visibility in image searches and drives more organic traffic to your website.



Image SEO best practices: Image SEO best practices include adding descriptive alt text, optimizing filenames, using appropriate dimensions, and compressing files. Following these guidelines improves search visibility and helps attract more organic traffic to your site.

Image sitemaps: An image sitemap is a file that lists the images on a website, helping search engines discover and index them. By submitting an image sitemap, you increase the visibility of your images in search results, driving more traffic to your site.

Image size reduction: Image size reduction involves scaling down image dimensions to fit the intended display area. Smaller image dimensions result in faster load times, better user experience, and improved search rankings.


Image usability: Image usability focuses on selecting images that are relevant, high-quality, and aligned with the content they accompany. By ensuring that images enhance rather than detract from the user experience, you increase engagement and improve search visibility.

Industry directories for links: Industry directories for links are specialized platforms that list businesses within a particular field. Submitting your site to relevant industry directories helps establish authority, improve local search visibility, and earn quality backlinks.

Industry-specific keywords: Industry-specific keywords focus on terms that are unique to your niche. By targeting these phrases, you can attract a highly relevant audience and build authority within your field.




Influencer link building: Influencer link building involves partnering with industry influencers who can share your content and link to your site. Their endorsements not only improve your backlink profile but also increase your brand's credibility and reach.

Influencer outreach for links: Influencer outreach for links involves building relationships with well-known figures in your industry who can share your content and provide backlinks. By leveraging their authority, you can increase your site's credibility and reach a larger audience.

Infographic link building: Infographic link building uses visually engaging, data-driven graphics to earn backlinks. When other websites feature your infographic and link back to your site as the source, you gain valuable backlinks and boost your content's reach.


 

Architecture of a Web crawler

A Web crawler, sometimes called a spider or spiderbot and often shortened to crawler, is an Internet bot that systematically browses the World Wide Web and that is typically operated by search engines for the purpose of Web indexing (web spidering).[1]

Web search engines and some other websites use Web crawling or spidering software to update their web content or indices of other sites' web content. Web crawlers copy pages for processing by a search engine, which indexes the downloaded pages so that users can search more efficiently.

Crawlers consume resources on visited systems and often visit sites unprompted. Issues of schedule, load, and "politeness" come into play when large collections of pages are accessed. Mechanisms exist for public sites not wishing to be crawled to make this known to the crawling agent. For example, including a robots.txt file can request bots to index only parts of a website, or nothing at all.
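
For example, a minimal robots.txt file at the site root might ask all crawlers to stay out of one directory while blocking a particular crawler entirely; the directory and bot names here are invented for illustration:

    User-agent: *
    Disallow: /private/

    User-agent: ExampleBadBot
    Disallow: /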

The number of Internet pages is extremely large; even the largest crawlers fall short of making a complete index. For this reason, search engines struggled to give relevant search results in the early years of the World Wide Web, before 2000. Today, relevant results are given almost instantly.

Crawlers can validate hyperlinks and HTML code. They can also be used for web scraping and data-driven programming.

Nomenclature


A web crawler is also known as a spider,[2] an ant, an automatic indexer,[3] or (in the FOAF software context) a Web scutter.[4]

Overview


A Web crawler starts with a list of URLs to visit. Those first URLs are called the seeds. As the crawler visits these URLs, by communicating with web servers that respond to those URLs, it identifies all the hyperlinks in the retrieved web pages and adds them to the list of URLs to visit, called the crawl frontier. URLs from the frontier are recursively visited according to a set of policies. If the crawler is performing archiving of websites (or web archiving), it copies and saves the information as it goes. The archives are usually stored in such a way they can be viewed, read and navigated as if they were on the live web, but are preserved as 'snapshots'.[5]
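
As a rough sketch of this loop (not any particular production crawler), a minimal breadth-first crawler in Python might keep a frontier queue seeded with the start URLs, fetch each page, and push newly discovered links back onto the frontier; the seed URL and page limit below are arbitrary examples:

    from collections import deque
    from urllib.parse import urljoin
    from urllib.request import urlopen
    from html.parser import HTMLParser

    class LinkParser(HTMLParser):
        """Collects href values from anchor tags."""
        def __init__(self):
            super().__init__()
            self.links = []
        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)

    def crawl(seeds, max_pages=50):
        frontier = deque(seeds)          # the crawl frontier
        visited = set()
        while frontier and len(visited) < max_pages:
            url = frontier.popleft()
            if url in visited:
                continue
            try:
                html = urlopen(url, timeout=10).read().decode("utf-8", "replace")
            except Exception:
                continue                 # skip pages that fail to download
            visited.add(url)
            parser = LinkParser()
            parser.feed(html)
            for link in parser.links:
                frontier.append(urljoin(url, link))  # resolve relative links
        return visited

    # Example with a hypothetical seed:
    # pages = crawl(["https://example.com/"])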

The archive is known as the repository and is designed to store and manage the collection of web pages. The repository only stores HTML pages and these pages are stored as distinct files. A repository is similar to any other system that stores data, like a modern-day database. The only difference is that a repository does not need all the functionality offered by a database system. The repository stores the most recent version of the web page retrieved by the crawler.[citation needed]

The large volume of the Web implies that the crawler can only download a limited number of pages within a given time, so it needs to prioritize its downloads. The Web's high rate of change implies that pages might already have been updated or even deleted by the time the crawler reaches them.

The number of possible URLs generated by server-side software has also made it difficult for web crawlers to avoid retrieving duplicate content. Endless combinations of HTTP GET (URL-based) parameters exist, of which only a small selection will actually return unique content. For example, a simple online photo gallery may offer three options to users, as specified through HTTP GET parameters in the URL. If there exist four ways to sort images, three choices of thumbnail size, two file formats, and an option to disable user-provided content, then the same set of content can be accessed with 48 different URLs, all of which may be linked on the site. This mathematical combination creates a problem for crawlers, as they must sort through endless combinations of relatively minor scripted changes in order to retrieve unique content.
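
To make the arithmetic concrete, the gallery example works out to 4 × 3 × 2 × 2 = 48 distinct URLs for the same underlying content; a short sketch with invented parameter names:

    from itertools import product

    # Invented parameter values mirroring the gallery example above.
    sort_orders = ["name", "date", "size", "rating"]   # 4 ways to sort
    thumb_sizes = ["small", "medium", "large"]          # 3 thumbnail sizes
    formats = ["jpg", "png"]                            # 2 file formats
    user_content = ["on", "off"]                        # toggle user-provided content

    urls = ["/gallery?sort=%s&thumb=%s&fmt=%s&ugc=%s" % combo
            for combo in product(sort_orders, thumb_sizes, formats, user_content)]
    print(len(urls))   # 48 URLs, all serving the same set of images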

As Edwards et al. noted, "Given that the bandwidth for conducting crawls is neither infinite nor free, it is becoming essential to crawl the Web in not only a scalable, but efficient way, if some reasonable measure of quality or freshness is to be maintained."[6] A crawler must carefully choose at each step which pages to visit next.

Crawling policy


The behavior of a Web crawler is the outcome of a combination of policies:[7]

  • a selection policy which states the pages to download,
  • a re-visit policy which states when to check for changes to the pages,
  • a politeness policy that states how to avoid overloading websites, and
  • a parallelization policy that states how to coordinate distributed web crawlers.

Selection policy


Given the current size of the Web, even large search engines cover only a portion of the publicly available part. A 2009 study showed even large-scale search engines index no more than 40–70% of the indexable Web;[8] a previous study by Steve Lawrence and Lee Giles showed that no search engine indexed more than 16% of the Web in 1999.[9] As a crawler always downloads just a fraction of the Web pages, it is highly desirable for the downloaded fraction to contain the most relevant pages and not just a random sample of the Web.

This requires a metric of importance for prioritizing Web pages. The importance of a page is a function of its intrinsic quality, its popularity in terms of links or visits, and even of its URL (the latter is the case of vertical search engines restricted to a single top-level domain, or search engines restricted to a fixed Web site). Designing a good selection policy has an added difficulty: it must work with partial information, as the complete set of Web pages is not known during crawling.

Junghoo Cho et al. made the first study on policies for crawling scheduling. Their data set was a 180,000-page crawl from the stanford.edu domain, in which a crawling simulation was done with different strategies.[10] The ordering metrics tested were breadth-first, backlink count and partial PageRank calculations. One of the conclusions was that if the crawler wants to download pages with high PageRank early during the crawling process, then the partial PageRank strategy is better, followed by breadth-first and backlink count. However, these results are for just a single domain. Cho also wrote his PhD dissertation at Stanford on web crawling.[11]

Najork and Wiener performed an actual crawl on 328 million pages, using breadth-first ordering.[12] They found that a breadth-first crawl captures pages with high PageRank early in the crawl (but they did not compare this strategy against other strategies). The explanation given by the authors for this result is that "the most important pages have many links to them from numerous hosts, and those links will be found early, regardless of on which host or page the crawl originates."

Abiteboul designed a crawling strategy based on an algorithm called OPIC (On-line Page Importance Computation).[13] In OPIC, each page is given an initial sum of "cash" that is distributed equally among the pages it points to. It is similar to a PageRank computation, but it is faster and is only done in one step. An OPIC-driven crawler downloads first the pages in the crawling frontier with higher amounts of "cash". Experiments were carried out on a 100,000-page synthetic graph with a power-law distribution of in-links. However, there was no comparison with other strategies nor experiments on the real Web.
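
A toy sketch of the "cash" idea follows (this is an illustration of the principle, not Abiteboul's actual implementation): every page starts with an equal amount of cash, each crawled page distributes its cash equally among its out-links, and the frontier is prioritized by accumulated cash. The tiny link graph is invented for the example.

    # Toy illustration of OPIC-style "cash" prioritization.
    graph = {                      # page -> pages it links to (made-up graph)
        "A": ["B", "C"],
        "B": ["C"],
        "C": ["A", "D"],
        "D": [],
    }

    cash = {page: 1.0 / len(graph) for page in graph}  # equal initial cash

    def crawl_step(page):
        """Distribute a crawled page's cash equally among its out-links."""
        out_links = graph[page]
        if out_links:
            share = cash[page] / len(out_links)
            for target in out_links:
                cash[target] += share
        cash[page] = 0.0

    # Always crawl the page currently holding the most cash next.
    for _ in range(6):
        next_page = max(cash, key=cash.get)
        crawl_step(next_page)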

Boldi et al. used simulation on subsets of the Web of 40 million pages from the .it domain and 100 million pages from the WebBase crawl, testing breadth-first against depth-first, random ordering and an omniscient strategy. The comparison was based on how well PageRank computed on a partial crawl approximates the true PageRank value. Some visits that accumulate PageRank very quickly (most notably, breadth-first and the omniscient visit) provide very poor progressive approximations.[14][15]

Baeza-Yates et al. used simulation on two subsets of the Web of 3 million pages from the .gr and .cl domains, testing several crawling strategies.[16] They showed that both the OPIC strategy and a strategy that uses the length of the per-site queues are better than breadth-first crawling, and that it is also very effective to use a previous crawl, when it is available, to guide the current one.

Daneshpajouh et al. designed a community-based algorithm for discovering good seeds.[17] Their method crawls web pages with high PageRank from different communities in fewer iterations than a crawl starting from random seeds. One can extract good seeds from a previously crawled Web graph using this new method. Using these seeds, a new crawl can be very effective.

Restricting followed links

A crawler may only want to seek out HTML pages and avoid all other MIME types. In order to request only HTML resources, a crawler may make an HTTP HEAD request to determine a Web resource's MIME type before requesting the entire resource with a GET request. To avoid making numerous HEAD requests, a crawler may examine the URL and only request a resource if the URL ends with certain characters such as .html, .htm, .asp, .aspx, .php, .jsp, .jspx or a slash. This strategy may cause numerous HTML Web resources to be unintentionally skipped.
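
A hedged sketch of both heuristics follows: first a cheap check on the URL suffix, and only then an HTTP HEAD request to confirm the Content-Type before fetching the full resource (the example URL is hypothetical):

    import urllib.request

    HTML_SUFFIXES = (".html", ".htm", ".asp", ".aspx", ".php", ".jsp", ".jspx", "/")

    def looks_like_html(url):
        """Cheap filter: keep only URLs ending with typical HTML suffixes."""
        return url.lower().endswith(HTML_SUFFIXES)

    def is_html(url):
        """Confirm the MIME type with a HEAD request before a full GET."""
        request = urllib.request.Request(url, method="HEAD")
        with urllib.request.urlopen(request, timeout=10) as response:
            return response.headers.get_content_type() == "text/html"

    # if looks_like_html(url) and is_html(url): fetch the page with a GET request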

Some crawlers may also avoid requesting any resources that have a "?" in them (are dynamically produced) in order to avoid spider traps that may cause the crawler to download an infinite number of URLs from a Web site. This strategy is unreliable if the site uses URL rewriting to simplify its URLs.

URL normalization


Crawlers usually perform some type of URL normalization in order to avoid crawling the same resource more than once. The term URL normalization, also called URL canonicalization, refers to the process of modifying and standardizing a URL in a consistent manner. There are several types of normalization that may be performed including conversion of URLs to lowercase, removal of "." and ".." segments, and adding trailing slashes to the non-empty path component.[18]
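
A rough sketch of those three normalization steps using Python's standard urllib is shown below; it is illustrative rather than a complete canonicalization routine, and the trailing-slash heuristic is an assumption for the example:

    from urllib.parse import urlsplit, urlunsplit
    import posixpath

    def normalize(url):
        parts = urlsplit(url)
        scheme = parts.scheme.lower()                 # lowercase scheme
        netloc = parts.netloc.lower()                 # lowercase host
        path = posixpath.normpath(parts.path or "/")  # remove "." and ".." segments
        if not path.endswith("/") and "." not in path.rsplit("/", 1)[-1]:
            path += "/"                               # add trailing slash to directory-like paths
        return urlunsplit((scheme, netloc, path, parts.query, parts.fragment))

    # normalize("HTTP://Example.COM/a/b/../c")  ->  "http://example.com/a/c/"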

Path-ascending crawling


Some crawlers intend to download/upload as many resources as possible from a particular web site. So a path-ascending crawler was introduced, which ascends to every path in each URL that it intends to crawl.[19] For example, when given a seed URL of http://llama.org/hamster/monkey/page.html, it will attempt to crawl /hamster/monkey/, /hamster/, and /. Cothey found that a path-ascending crawler was very effective in finding isolated resources, or resources for which no inbound link would have been found in regular crawling.
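
A small sketch of the ascent for the llama.org example above: strip one path segment at a time until the site root is reached.

    from urllib.parse import urlsplit, urlunsplit

    def ascending_paths(url):
        """Yield every ancestor path of a URL, e.g. /hamster/monkey/, /hamster/, /."""
        parts = urlsplit(url)
        segments = [s for s in parts.path.split("/") if s]
        for depth in range(len(segments) - 1, -1, -1):
            path = "/" + "/".join(segments[:depth])
            if not path.endswith("/"):
                path += "/"
            yield urlunsplit((parts.scheme, parts.netloc, path, "", ""))

    # list(ascending_paths("http://llama.org/hamster/monkey/page.html"))
    # -> ['http://llama.org/hamster/monkey/', 'http://llama.org/hamster/', 'http://llama.org/']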

Focused crawling


The importance of a page for a crawler can also be expressed as a function of the similarity of a page to a given query. Web crawlers that attempt to download pages that are similar to each other are called focused crawlers or topical crawlers. The concepts of topical and focused crawling were first introduced by Filippo Menczer[20][21] and by Soumen Chakrabarti et al.[22]

The main problem in focused crawling is that, in the context of a Web crawler, we would like to be able to predict the similarity of the text of a given page to the query before actually downloading the page. A possible predictor is the anchor text of links; this was the approach taken by Pinkerton[23] in the first web crawler of the early days of the Web. Diligenti et al.[24] propose using the complete content of the pages already visited to infer the similarity between the driving query and the pages that have not been visited yet. The performance of focused crawling depends mostly on the richness of links in the specific topic being searched, and focused crawling usually relies on a general Web search engine to provide starting points.
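
As a very rough sketch of the anchor-text idea (in the spirit of that approach, not Pinkerton's actual code), candidate links can be scored by how many query terms appear in their anchor text, and the frontier sorted accordingly:

    def score_link(anchor_text, query):
        """Score a link by the fraction of query terms appearing in its anchor text."""
        query_terms = set(query.lower().split())
        anchor_terms = set(anchor_text.lower().split())
        if not query_terms:
            return 0.0
        return len(query_terms & anchor_terms) / len(query_terms)

    # Crawl the highest-scoring links first (link.anchor is a hypothetical attribute):
    # frontier.sort(key=lambda link: score_link(link.anchor, "solar panel efficiency"), reverse=True)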

Academic focused crawler

An example of focused crawlers are academic crawlers, which crawl freely accessible academic documents, such as citeseerxbot, the crawler of the CiteSeerX search engine. Other academic search engines include Google Scholar and Microsoft Academic Search. Because most academic papers are published in PDF format, this kind of crawler is particularly interested in crawling PDF, PostScript and Microsoft Word files, including their zipped formats. Because of this, general open-source crawlers, such as Heritrix, must be customized to filter out other MIME types, or middleware is used to extract these documents and import them into the focused crawl database and repository.[25] Identifying whether these documents are academic or not is challenging and can add significant overhead to the crawling process, so this is performed as a post-crawling process using machine learning or regular-expression algorithms. These academic documents are usually obtained from the home pages of faculties and students or from the publication pages of research institutes. Because academic documents make up only a small fraction of all web pages, good seed selection is important in boosting the efficiency of these web crawlers.[26] Other academic crawlers may download plain text and HTML files that contain metadata of academic papers, such as titles, papers, and abstracts. This increases the overall number of papers, but a significant fraction may not provide free PDF downloads.

Semantic focused crawler

Another type of focused crawler is the semantic focused crawler, which makes use of domain ontologies to represent topical maps and link Web pages with relevant ontological concepts for selection and categorization purposes.[27] In addition, ontologies can be automatically updated in the crawling process. Dong et al.[28] introduced such an ontology-learning-based crawler using a support-vector machine to update the content of ontological concepts when crawling Web pages.

Re-visit policy


The Web has a very dynamic nature, and crawling a fraction of the Web can take weeks or months. By the time a Web crawler has finished its crawl, many events could have happened, including creations, updates, and deletions.

From the search engine's point of view, there is a cost associated with not detecting an event, and thus having an outdated copy of a resource. The most-used cost functions are freshness and age.[29]

Freshness: This is a binary measure that indicates whether the local copy is accurate or not. The freshness of a page p in the repository at time t is defined as: F_p(t) = 1 if p is equal to the local copy at time t, and 0 otherwise.

Age: This is a measure that indicates how outdated the local copy is. The age of a page p in the repository at time t is defined as: A_p(t) = 0 if p has not been modified at time t, and t minus the modification time of p otherwise.

Coffman et al. worked with a definition of the objective of a Web crawler that is equivalent to freshness, but use a different wording: they propose that a crawler must minimize the fraction of time pages remain outdated. They also noted that the problem of Web crawling can be modeled as a multiple-queue, single-server polling system, on which the Web crawler is the server and the Web sites are the queues. Page modifications are the arrival of the customers, and switch-over times are the interval between page accesses to a single Web site. Under this model, mean waiting time for a customer in the polling system is equivalent to the average age for the Web crawler.[30]

The objective of the crawler is to keep the average freshness of pages in its collection as high as possible, or to keep the average age of pages as low as possible. These objectives are not equivalent: in the first case, the crawler is just concerned with how many pages are outdated, while in the second case, the crawler is concerned with how old the local copies of pages are.

Figure: Evolution of freshness and age in a Web crawler.

Two simple re-visiting policies were studied by Cho and Garcia-Molina:[31]

  • Uniform policy: This involves re-visiting all pages in the collection with the same frequency, regardless of their rates of change.
  • Proportional policy: This involves re-visiting more often the pages that change more frequently. The visiting frequency is directly proportional to the (estimated) change frequency.

In both cases, the repeated crawling order of pages can be done either in a random or a fixed order.

Cho and Garcia-Molina proved the surprising result that, in terms of average freshness, the uniform policy outperforms the proportional policy in both a simulated Web and a real Web crawl. Intuitively, the reasoning is that, as web crawlers have a limit to how many pages they can crawl in a given time frame, (1) they will allocate too many new crawls to rapidly changing pages at the expense of less frequently updating pages, and (2) the freshness of rapidly changing pages lasts for a shorter period than that of less frequently changing pages. In other words, a proportional policy allocates more resources to crawling frequently updating pages, but experiences less overall freshness time from them.

To improve freshness, the crawler should penalize the elements that change too often.[32] The optimal re-visiting policy is neither the uniform policy nor the proportional policy. The optimal method for keeping average freshness high includes ignoring the pages that change too often, and the optimal method for keeping average age low is to use access frequencies that monotonically (and sub-linearly) increase with the rate of change of each page. In both cases, the optimal policy is closer to the uniform policy than to the proportional policy: as Coffman et al. note, "in order to minimize the expected obsolescence time, the accesses to any particular page should be kept as evenly spaced as possible".[30] Explicit formulas for the re-visit policy are not attainable in general, but they are obtained numerically, as they depend on the distribution of page changes. Cho and Garcia-Molina show that the exponential distribution is a good fit for describing page changes,[32] while Ipeirotis et al. show how to use statistical tools to discover parameters that affect this distribution.[33] The re-visiting policies considered here regard all pages as homogeneous in terms of quality ("all pages on the Web are worth the same"), something that is not a realistic scenario, so further information about the Web page quality should be included to achieve a better crawling policy.

Politeness policy


Crawlers can retrieve data much quicker and in greater depth than human searchers, so they can have a crippling impact on the performance of a site. If a single crawler is performing multiple requests per second and/or downloading large files, a server can have a hard time keeping up with requests from multiple crawlers.

As noted by Koster, the use of Web crawlers is useful for a number of tasks, but comes with a price for the general community.[34] The costs of using Web crawlers include:

  • network resources, as crawlers require considerable bandwidth and operate with a high degree of parallelism during a long period of time;
  • server overload, especially if the frequency of accesses to a given server is too high;
  • poorly written crawlers, which can crash servers or routers, or which download pages they cannot handle; and
  • personal crawlers that, if deployed by too many users, can disrupt networks and Web servers.

A partial solution to these problems is the robots exclusion protocol, also known as the robots.txt protocol, which is a standard for administrators to indicate which parts of their Web servers should not be accessed by crawlers.[35] This standard does not include a suggestion for the interval of visits to the same server, even though this interval is the most effective way of avoiding server overload. Recently, commercial search engines like Google, Ask Jeeves, MSN and Yahoo! Search have been able to use an extra "Crawl-delay:" parameter in the robots.txt file to indicate the number of seconds to delay between requests.
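
Python's standard library ships a parser for this protocol; a brief sketch of checking permission and the optional Crawl-delay for a hypothetical site and crawler name:

    import urllib.robotparser

    rp = urllib.robotparser.RobotFileParser()
    rp.set_url("https://example.com/robots.txt")   # hypothetical site
    rp.read()

    if rp.can_fetch("MyCrawler", "https://example.com/some/page.html"):
        delay = rp.crawl_delay("MyCrawler")        # None if no Crawl-delay directive
        # fetch the page, then sleep for `delay` seconds (or a sensible default)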

The first proposed interval between successive pageloads was 60 seconds.[36] However, if pages were downloaded at this rate from a website with more than 100,000 pages over a perfect connection with zero latency and infinite bandwidth, it would take more than 2 months to download only that entire Web site; also, only a fraction of the resources from that Web server would be used.

Cho uses 10 seconds as an interval for accesses,[31] and the WIRE crawler uses 15 seconds as the default.[37] The MercatorWeb crawler follows an adaptive politeness policy: if it took t seconds to download a document from a given server, the crawler waits for 10t seconds before downloading the next page.[38] Dill et al. use 1 second.[39]
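
The adaptive rule described for the Mercator crawler is simple to sketch: time the download and wait ten times as long before the next request to the same server. The factor of 10 comes from the text above; everything else (URL, timeout) is illustrative.

    import time
    import urllib.request

    def polite_fetch(url, factor=10):
        """Fetch a URL and return (body, recommended wait before hitting the same host again)."""
        start = time.monotonic()
        with urllib.request.urlopen(url, timeout=30) as response:
            body = response.read()
        elapsed = time.monotonic() - start
        return body, factor * elapsed   # wait 10*t seconds per the adaptive policy

    # body, wait = polite_fetch("https://example.com/")   # hypothetical URL
    # time.sleep(wait)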

For those using Web crawlers for research purposes, a more detailed cost-benefit analysis is needed and ethical considerations should be taken into account when deciding where to crawl and how fast to crawl.[40]

Anecdotal evidence from access logs shows that access intervals from known crawlers vary between 20 seconds and 3–4 minutes. It is worth noticing that even when being very polite, and taking all the safeguards to avoid overloading Web servers, some complaints from Web server administrators are received. Sergey Brin and Larry Page noted in 1998, "... running a crawler which connects to more than half a million servers ... generates a fair amount of e-mail and phone calls. Because of the vast number of people coming on line, there are always those who do not know what a crawler is, because this is the first one they have seen."[41]

Parallelization policy


A parallel crawler is a crawler that runs multiple processes in parallel. The goal is to maximize the download rate while minimizing the overhead from parallelization and to avoid repeated downloads of the same page. To avoid downloading the same page more than once, the crawling system requires a policy for assigning the new URLs discovered during the crawling process, as the same URL can be found by two different crawling processes.

Architectures

High-level architecture of a standard Web crawler

A crawler must not only have a good crawling strategy, as noted in the previous sections, but it should also have a highly optimized architecture.

Shkapenyuk and Suel noted that:[42]

While it is fairly easy to build a slow crawler that downloads a few pages per second for a short period of time, building a high-performance system that can download hundreds of millions of pages over several weeks presents a number of challenges in system design, I/O and network efficiency, and robustness and manageability.

Web crawlers are a central part of search engines, and details on their algorithms and architecture are kept as business secrets. When crawler designs are published, there is often an important lack of detail that prevents others from reproducing the work. There are also emerging concerns about "search engine spamming", which prevent major search engines from publishing their ranking algorithms.

Security


While most website owners are keen to have their pages indexed as broadly as possible to have a strong presence in search engines, web crawling can also have unintended consequences and lead to a compromise or data breach if a search engine indexes resources that should not be publicly available, or pages revealing potentially vulnerable versions of software.

Apart from standard web application security recommendations, website owners can reduce their exposure to opportunistic hacking by only allowing search engines to index the public parts of their websites (with robots.txt) and explicitly blocking them from indexing transactional parts (login pages, private pages, etc.).

Crawler identification


Web crawlers typically identify themselves to a Web server by using the User-agent field of an HTTP request. Web site administrators typically examine their Web servers' logs and use the user agent field to determine which crawlers have visited the web server and how often. The user agent field may include a URL where the Web site administrator may find out more information about the crawler. Examining Web server logs is a tedious task, and therefore some administrators use tools to identify, track and verify Web crawlers. Spambots and other malicious Web crawlers are unlikely to place identifying information in the user agent field, or they may mask their identity as a browser or other well-known crawler.
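
A polite crawler typically sets this field explicitly on every request; a minimal sketch follows, where the agent name and contact URL are made up for the example:

    import urllib.request

    request = urllib.request.Request(
        "https://example.com/",                                    # hypothetical URL
        headers={"User-Agent": "ExampleBot/1.0 (+https://example.com/bot-info)"},
    )
    with urllib.request.urlopen(request, timeout=10) as response:
        page = response.read()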

Web site administrators prefer Web crawlers to identify themselves so that they can contact the owner if needed. In some cases, crawlers may be accidentally trapped in a crawler trap or they may be overloading a Web server with requests, and the owner needs to stop the crawler. Identification is also useful for administrators that are interested in knowing when they may expect their Web pages to be indexed by a particular search engine.

Crawling the deep web


A vast amount of web pages lie in the deep or invisible web.[43] These pages are typically only accessible by submitting queries to a database, and regular crawlers are unable to find these pages if there are no links that point to them. Google's Sitemaps protocol and mod oai[44] are intended to allow discovery of these deep-Web resources.

Deep web crawling also multiplies the number of web links to be crawled. Some crawlers only take some of the URLs in <a href="URL"> form. In some cases, such as the Googlebot, Web crawling is done on all text contained inside the hypertext content, tags, or text.

Strategic approaches may be taken to target deep Web content. With a technique called screen scraping, specialized software may be customized to automatically and repeatedly query a given Web form with the intention of aggregating the resulting data. Such software can be used to span multiple Web forms across multiple Websites. Data extracted from the results of one Web form submission can be taken and applied as input to another Web form thus establishing continuity across the Deep Web in a way not possible with traditional web crawlers.[45]

Pages built on AJAX are among those causing problems to web crawlers. Google has proposed a format of AJAX calls that their bot can recognize and index.[46]

Visual vs programmatic crawlers


There are a number of "visual web scraper/crawler" products available on the web which will crawl pages and structure data into columns and rows based on the user's requirements. One of the main differences between a classic and a visual crawler is the level of programming ability required to set up a crawler. The latest generation of "visual scrapers" removes most of the programming skill needed to be able to program and start a crawl to scrape web data.

The visual scraping/crawling method relies on the user "teaching" a piece of crawler technology, which then follows patterns in semi-structured data sources. The dominant method for teaching a visual crawler is by highlighting data in a browser and training columns and rows. While the technology is not new, for example it was the basis of Needlebase which has been bought by Google (as part of a larger acquisition of ITA Labs[47]), there is continued growth and investment in this area by investors and end-users.[citation needed]

List of web crawlers


The following is a list of published crawler architectures for general-purpose crawlers (excluding focused web crawlers), with a brief description that includes the names given to the different components and outstanding features:

Historical web crawlers

  • WolfBot was a massively multi-threaded crawler built in 2001 by Mani Singh, a Civil Engineering graduate of the University of California at Davis.
  • World Wide Web Worm was a crawler used to build a simple index of document titles and URLs. The index could be searched by using the grep Unix command.
  • Yahoo! Slurp was the name of the Yahoo! Search crawler until Yahoo! contracted with Microsoft to use Bingbot instead.

In-house web crawlers

  • Applebot is Apple's web crawler. It supports Siri and other products.[48]
  • Bingbot is the name of Microsoft's Bing webcrawler. It replaced Msnbot.
  • Baiduspider is Baidu's web crawler.
  • DuckDuckBot is DuckDuckGo's web crawler.
  • Googlebot is described in some detail, but the reference is only about an early version of its architecture, which was written in C++ and Python. The crawler was integrated with the indexing process, because text parsing was done for full-text indexing and also for URL extraction. A URL server sent lists of URLs to be fetched by several crawling processes. During parsing, the URLs found were passed to a URL server that checked whether the URL had been previously seen. If not, the URL was added to the queue of the URL server.
  • WebCrawler was used to build the first publicly available full-text index of a subset of the Web. It was based on lib-WWW to download pages, and another program to parse and order URLs for breadth-first exploration of the Web graph. It also included a real-time crawler that followed links based on the similarity of the anchor text with the provided query.
  • WebFountain is a distributed, modular crawler similar to Mercator but written in C++.
  • Xenon is a web crawler used by government tax authorities to detect fraud.[49][50]

Commercial web crawlers


The following web crawlers are available for a price:

Open-source crawlers

  • Apache Nutch is a highly extensible and scalable web crawler written in Java and released under an Apache License. It is based on Apache Hadoop and can be used with Apache Solr or Elasticsearch.
  • Grub was an open source distributed search crawler that Wikia Search used to crawl the web.
  • Heritrix is the Internet Archive's archival-quality crawler, designed for archiving periodic snapshots of a large portion of the Web. It was written in Java.
  • ht://Dig includes a Web crawler in its indexing engine.
  • HTTrack uses a Web crawler to create a mirror of a web site for off-line viewing. It is written in C and released under the GPL.
  • Norconex Web Crawler is a highly extensible Web Crawler written in Java and released under an Apache License. It can be used with many repositories such as Apache Solr, Elasticsearch, Microsoft Azure Cognitive Search, Amazon CloudSearch and more.
  • mnoGoSearch is a crawler, indexer and a search engine written in C and licensed under the GPL (*NIX machines only)
  • Open Search Server is a search engine and web crawler software released under the GPL.
  • Scrapy, an open source web crawler framework, written in Python (licensed under BSD).
  • Seeks, a free distributed search engine (licensed under AGPL).
  • StormCrawler, a collection of resources for building low-latency, scalable web crawlers on Apache Storm (Apache License).
  • tkWWW Robot, a crawler based on the tkWWW web browser (licensed under GPL).
  • GNU Wget is a command-line-operated crawler written in C and released under the GPL. It is typically used to mirror Web and FTP sites.
  • YaCy, a free distributed search engine, built on principles of peer-to-peer networks (licensed under GPL).


References

  1. ^ "Web Crawlers: Browsing the Web". Archived from the original on 6 December 2021.
  2. ^ Spetka, Scott. "The TkWWW Robot: Beyond Browsing". NCSA. Archived from the original on 3 September 2004. Retrieved 21 November 2010.
  3. ^ Kobayashi, M. & Takeda, K. (2000). "Information retrieval on the web". ACM Computing Surveys. 32 (2): 144–173. CiteSeerX 10.1.1.126.6094. doi:10.1145/358923.358934. S2CID 3710903.
  4. ^ See definition of scutter on FOAF Project's wiki Archived 13 December 2009 at the Wayback Machine
  5. ^ Masanès, Julien (15 February 2007). Web Archiving. Springer. p. 1. ISBN 978-3-54046332-0. Retrieved 24 April 2014.
  6. ^ Edwards, J.; McCurley, K. S.; Tomlin, J. A. (2001). "An adaptive model for optimizing performance of an incremental web crawler". Proceedings of the 10th international conference on World Wide Web. pp. 106–113. CiteSeerX 10.1.1.1018.1506. doi:10.1145/371920.371960. ISBN 978-1581133486. S2CID 10316730. Archived from the original on 25 June 2014. Retrieved 25 January 2007.
  7. ^ Castillo, Carlos (2004). Effective Web Crawling (PhD thesis). University of Chile. Retrieved 3 August 2010.
  8. ^ Gulli, A.; Signori, A. (2005). "The indexable web is more than 11.5 billion pages". Special interest tracks and posters of the 14th international conference on World Wide Web. ACM Press. pp. 902–903. doi:10.1145/1062745.1062789.
  9. ^ Lawrence, Steve; C. Lee Giles (8 July 1999). "Accessibility of information on the web". Nature. 400 (6740): 107–9. Bibcode:1999Natur.400..107L. doi:10.1038/21987. PMID 10428673. S2CID 4347646.
  10. ^ Cho, J.; Garcia-Molina, H.; Page, L. (April 1998). "Efficient Crawling Through URL Ordering". Seventh International World-Wide Web Conference. Brisbane, Australia. doi:10.1142/3725. ISBN 978-981-02-3400-3. Retrieved 23 March 2009.
  11. ^ Cho, Junghoo, "Crawling the Web: Discovery and Maintenance of a Large-Scale Web Data", PhD dissertation, Department of Computer Science, Stanford University, November 2001.
  12. ^ Najork, Marc and Janet L. Wiener. "Breadth-first crawling yields high-quality pages". Archived 24 December 2017 at the Wayback Machine In: Proceedings of the Tenth Conference on World Wide Web, pages 114–118, Hong Kong, May 2001. Elsevier Science.
  13. ^ Abiteboul, Serge; Mihai Preda; Gregory Cobena (2003). "Adaptive on-line page importance computation". Proceedings of the 12th international conference on World Wide Web. Budapest, Hungary: ACM. pp. 280–290. doi:10.1145/775152.775192. ISBN 1-58113-680-3. Retrieved 22 March 2009.
  14. ^ Boldi, Paolo; Bruno Codenotti; Massimo Santini; Sebastiano Vigna (2004). "UbiCrawler: a scalable fully distributed Web crawler" (PDF). Software: Practice and Experience. 34 (8): 711–726. CiteSeerX 10.1.1.2.5538. doi:10.1002/spe.587. S2CID 325714. Archived from the original (PDF) on 20 March 2009. Retrieved 23 March 2009.
  15. ^ Boldi, Paolo; Massimo Santini; Sebastiano Vigna (2004). "Do Your Worst to Make the Best: Paradoxical Effects in PageRank Incremental Computations" (PDF). Algorithms and Models for the Web-Graph. Lecture Notes in Computer Science. Vol. 3243. pp. 168–180. doi:10.1007/978-3-540-30216-2_14. ISBN 978-3-540-23427-2. Archived from the original (PDF) on 1 October 2005. Retrieved 23 March 2009.
  16. ^ Baeza-Yates, R.; Castillo, C.; Marin, M. and Rodriguez, A. (2005). "Crawling a Country: Better Strategies than Breadth-First for Web Page Ordering." In: Proceedings of the Industrial and Practical Experience track of the 14th conference on World Wide Web, pages 864–872, Chiba, Japan. ACM Press.
  17. ^ Shervin Daneshpajouh, Mojtaba Mohammadi Nasiri, Mohammad Ghodsi, A Fast Community Based Algorithm for Generating Crawler Seeds Set. In: Proceedings of 4th International Conference on Web Information Systems and Technologies (Webist-2008), Funchal, Portugal, May 2008.
  18. ^ Pant, Gautam; Srinivasan, Padmini; Menczer, Filippo (2004). "Crawling the Web" (PDF). In Levene, Mark; Poulovassilis, Alexandra (eds.). Web Dynamics: Adapting to Change in Content, Size, Topology and Use. Springer. pp. 153–178. ISBN 978-3-540-40676-1. Archived from the original (PDF) on 20 March 2009. Retrieved 9 May 2006.
  19. ^ Cothey, Viv (2004). "Web-crawling reliability" (PDF). Journal of the American Society for Information Science and Technology. 55 (14): 1228–1238. CiteSeerX 10.1.1.117.185. doi:10.1002/asi.20078.
  20. ^ Menczer, F. (1997). ARACHNID: Adaptive Retrieval Agents Choosing Heuristic Neighborhoods for Information Discovery Archived 21 December 2012 at the Wayback Machine. In D. Fisher, ed., Machine Learning: Proceedings of the 14th International Conference (ICML97). Morgan Kaufmann
  21. ^ Menczer, F. and Belew, R.K. (1998). Adaptive Information Agents in Distributed Textual Environments Archived 21 December 2012 at the Wayback Machine. In K. Sycara and M. Wooldridge (eds.) Proc. 2nd Intl. Conf. on Autonomous Agents (Agents '98). ACM Press
  22. ^ Chakrabarti, Soumen; Van Den Berg, Martin; Dom, Byron (1999). "Focused crawling: A new approach to topic-specific Web resource discovery" (PDF). Computer Networks. 31 (11–16): 1623–1640. doi:10.1016/s1389-1286(99)00052-3. Archived from the original (PDF) on 17 March 2004.
  23. ^ Pinkerton, B. (1994). Finding what people want: Experiences with the WebCrawler. In Proceedings of the First World Wide Web Conference, Geneva, Switzerland.
  24. ^ Diligenti, M., Coetzee, F., Lawrence, S., Giles, C. L., and Gori, M. (2000). Focused crawling using context graphs. In Proceedings of 26th International Conference on Very Large Databases (VLDB), pages 527-534, Cairo, Egypt.
  25. ^ Wu, Jian; Teregowda, Pradeep; Khabsa, Madian; Carman, Stephen; Jordan, Douglas; San Pedro Wandelmer, Jose; Lu, Xin; Mitra, Prasenjit; Giles, C. Lee (2012). "Web crawler middleware for search engine digital libraries". Proceedings of the twelfth international workshop on Web information and data management - WIDM '12. p. 57. doi:10.1145/2389936.2389949. ISBN 9781450317207. S2CID 18513666.
  26. ^ Wu, Jian; Teregowda, Pradeep; Ramírez, Juan Pablo Fernández; Mitra, Prasenjit; Zheng, Shuyi; Giles, C. Lee (2012). "The evolution of a crawling strategy for an academic document search engine". Proceedings of the 3rd Annual ACM Web Science Conference on - Web Sci '12. pp. 340–343. doi:10.1145/2380718.2380762. ISBN 9781450312288. S2CID 16718130.
  27. ^ Dong, Hai; Hussain, Farookh Khadeer; Chang, Elizabeth (2009). "State of the Art in Semantic Focused Crawlers". Computational Science and Its Applications – ICCSA 2009. Lecture Notes in Computer Science. Vol. 5593. pp. 910–924. doi:10.1007/978-3-642-02457-3_74. hdl:20.500.11937/48288. ISBN 978-3-642-02456-6.
  28. ^ Dong, Hai; Hussain, Farookh Khadeer (2013). "SOF: A semi-supervised ontology-learning-based focused crawler". Concurrency and Computation: Practice and Experience. 25 (12): 1755–1770. doi:10.1002/cpe.2980. S2CID 205690364.
  29. ^ Junghoo Cho; Hector Garcia-Molina (2000). "Synchronizing a database to improve freshness" (PDF). Proceedings of the 2000 ACM SIGMOD international conference on Management of data. Dallas, Texas, United States: ACM. pp. 117–128. doi:10.1145/342009.335391. ISBN 1-58113-217-4. Retrieved 23 March 2009.
  30. ^ a b E. G. Coffman Jr; Zhen Liu; Richard R. Weber (1998). "Optimal robot scheduling for Web search engines". Journal of Scheduling. 1 (1): 15–29. CiteSeerX 10.1.1.36.6087. doi:10.1002/(SICI)1099-1425(199806)1:1<15::AID-JOS3>3.0.CO;2-K.
  31. ^ a b Cho, Junghoo; Garcia-Molina, Hector (2003). "Effective page refresh policies for Web crawlers". ACM Transactions on Database Systems. 28 (4): 390–426. doi:10.1145/958942.958945. S2CID 147958.
  32. ^ a b Junghoo Cho; Hector Garcia-Molina (2003). "Estimating frequency of change". ACM Transactions on Internet Technology. 3 (3): 256–290. CiteSeerX 10.1.1.59.5877. doi:10.1145/857166.857170. S2CID 9362566.
  33. ^ Ipeirotis, P., Ntoulas, A., Cho, J., Gravano, L. (2005) Modeling and managing content changes in text databases Archived 5 September 2005 at the Wayback Machine. In Proceedings of the 21st IEEE International Conference on Data Engineering, pages 606-617, April 2005, Tokyo.
  34. ^ Koster, M. (1995). Robots in the web: threat or treat? ConneXions, 9(4).
  35. ^ Koster, M. (1996). A standard for robot exclusion Archived 7 November 2007 at the Wayback Machine.
  36. ^ Koster, M. (1993). Guidelines for robots writers Archived 22 April 2005 at the Wayback Machine.
  37. ^ Baeza-Yates, R. and Castillo, C. (2002). Balancing volume, quality and freshness in Web crawling. In Soft Computing Systems – Design, Management and Applications, pages 565–572, Santiago, Chile. IOS Press Amsterdam.
  38. ^ Heydon, Allan; Najork, Marc (26 June 1999). "Mercator: A Scalable, Extensible Web Crawler" (PDF). Archived from the original (PDF) on 19 February 2006. Retrieved 22 March 2009.
  39. ^ Dill, S.; Kumar, R.; Mccurley, K. S.; Rajagopalan, S.; Sivakumar, D.; Tomkins, A. (2002). "Self-similarity in the web" (PDF). ACM Transactions on Internet Technology. 2 (3): 205–223. doi:10.1145/572326.572328. S2CID 6416041.
  40. ^ M. Thelwall; D. Stuart (2006). "Web crawling ethics revisited: Cost, privacy and denial of service". Journal of the American Society for Information Science and Technology. 57 (13): 1771–1779. doi:10.1002/asi.20388.
  41. ^ Brin, Sergey; Page, Lawrence (1998). "The anatomy of a large-scale hypertextual Web search engine". Computer Networks and ISDN Systems. 30 (1–7): 107–117. doi:10.1016/s0169-7552(98)00110-x. S2CID 7587743.
  42. ^ Shkapenyuk, V. and Suel, T. (2002). Design and implementation of a high performance distributed web crawler. In Proceedings of the 18th International Conference on Data Engineering (ICDE), pages 357-368, San Jose, California. IEEE CS Press.
  43. ^ Shestakov, Denis (2008). Search Interfaces on the Web: Querying and Characterizing Archived 6 July 2014 at the Wayback Machine. TUCS Doctoral Dissertations 104, University of Turku
  44. ^ Michael L Nelson; Herbert Van de Sompel; Xiaoming Liu; Terry L Harrison; Nathan McFarland (24 March 2005). "mod_oai: An Apache Module for Metadata Harvesting". arXiv:cs/0503069. Bibcode:2005cs........3069N.
  45. ^ Shestakov, Denis; Bhowmick, Sourav S.; Lim, Ee-Peng (2005). "DEQUE: Querying the Deep Web" (PDF). Data & Knowledge Engineering. 52 (3): 273–311. doi:10.1016/s0169-023x(04)00107-7.
  46. ^ "AJAX crawling: Guide for webmasters and developers". Retrieved 17 March 2013.
  47. ^ ITA Labs "ITA Labs Acquisition" Archived 18 March 2014 at the Wayback Machine 20 April 2011 1:28 AM
  48. ^ "About Applebot". Apple Inc. Retrieved 18 October 2021.
  49. ^ Norton, Quinn (25 January 2007). "Tax takers send in the spiders". Business. Wired. Archived from the original on 22 December 2016. Retrieved 13 October 2017.
  50. ^ "Xenon web crawling initiative: privacy impact assessment (PIA) summary". Ottawa: Government of Canada. 11 April 2017. Archived from the original on 25 September 2017. Retrieved 13 October 2017.


Google Maps
Screenshot of Google Maps in a web browser
Type of site: Web mapping
Available in: 74 languages (Afrikaans, Azerbaijani, Indonesian, Malay, Bosnian, Catalan, Czech, Danish, German (Germany), Estonian, English (United States), Spanish (Spain), Spanish (Latin America), Basque, Filipino, French (France), Galician, Croatian, Zulu, Icelandic, Italian, Swahili, Latvian, Lithuanian, Hungarian, Dutch, Norwegian, Uzbek, Polish, Portuguese (Brazil), Portuguese (Portugal), Romanian, Albanian, Slovak, Slovenian, Finnish, Swedish, Vietnamese, Turkish, Greek, Bulgarian, Kyrgyz, Kazakh, Macedonian, Mongolian, Russian, Serbian, Ukrainian, Georgian, Armenian, Hebrew, Urdu, Arabic, Persian, Amharic, Nepali, Hindi, Marathi, Bengali, Punjabi, Gujarati, Tamil, Telugu, Kannada, Malayalam, Sinhala, Thai, Lao, Burmese, Khmer, Korean, Japanese, Simplified Chinese, Traditional Chinese)
Owner: Google
URL: google.com/maps
Commercial: Yes
Registration: Optional, included with a Google Account
Launched: February 8, 2005
Current status: Active
Written in: C++ (back-end), JavaScript, XML, Ajax (UI)

Google Maps is a web mapping platform and consumer application offered by Google. It offers satellite imagery, aerial photography, street maps, 360° interactive panoramic views of streets (Street View), real-time traffic conditions, and route planning for traveling by foot, car, bike, air (in beta) and public transportation. As of 2020, Google Maps was being used by over one billion people every month around the world.[1]

Google Maps began as a C++ desktop program developed by brothers Lars and Jens Rasmussen in Australia at Where 2 Technologies. In October 2004, the company was acquired by Google, which converted it into a web application. After additional acquisitions of a geospatial data visualization company and a real-time traffic analyzer, Google Maps was launched in February 2005.[2] The service's front end utilizes JavaScript, XML, and Ajax. Google Maps offers an API that allows maps to be embedded on third-party websites,[3] and offers a locator for businesses and other organizations in numerous countries around the world. Google Map Maker allowed users to collaboratively expand and update the service's mapping worldwide but was discontinued in March 2017. However, crowdsourced contributions to Google Maps were not discontinued, as the company announced those features would be transferred to the Google Local Guides program,[4] although users that are not Local Guides can still contribute.

Google Maps' satellite view is a "top-down" or bird's-eye view; most of the high-resolution imagery of cities is aerial photography taken from aircraft flying at 800 to 1,500 feet (240 to 460 m), while most other imagery is from satellites.[5] Much of the available satellite imagery is no more than three years old and is updated on a regular basis, according to a 2011 report.[6] Google Maps previously used a variant of the Mercator projection, and therefore could not accurately show areas around the poles.[7] In August 2018, the desktop version of Google Maps was updated to show a 3D globe. It is still possible to switch back to the 2D map in the settings.

Google Maps for mobile devices was first released in 2006; the latest versions feature GPS turn-by-turn navigation along with dedicated parking assistance features. By 2013, it was found to be the world's most popular smartphone app, with over 54% of global smartphone owners using it.[8] In 2017, the app was reported to have two billion users on Android, along with several other Google services including YouTube, Chrome, Gmail, Search, and Google Play.

History


Acquisitions


Google Maps first started as a C++ program designed by two Danish brothers, Lars and Jens Eilstrup Rasmussen, and Noel Gordon and Stephen Ma, at the Sydney-based company Where 2 Technologies, which was founded in early 2003. The program was initially designed to be separately downloaded by users, but the company later pitched the idea for a purely Web-based product to Google management, changing the method of distribution.[9] In October 2004, the company was acquired by Google Inc.[10] where it transformed into the web application Google Maps. The Rasmussen brothers, Gordon and Ma joined Google at that time.

In the same month, Google acquired Keyhole, a geospatial data visualization company (with investment from the CIA), whose marquee application suite, Earth Viewer, emerged as the Google Earth application in 2005 while other aspects of its core technology were integrated into Google Maps.[11] In September 2004, Google acquired ZipDash, a company that provided real-time traffic analysis.[12]

2005–2010

Google Maps Beta in 2005

The launch of Google Maps was first announced on the Google Blog on February 8, 2005.[13]

In September 2005, in the aftermath of Hurricane Katrina, Google Maps quickly updated its satellite imagery of New Orleans to allow users to view the extent of the flooding in various parts of that city.[14][15]

As of 2007, Google Maps was equipped with a miniature view with a draggable rectangle that denotes the area shown in the main viewport, and "Info windows" for previewing details about locations on maps.[16] As of 2024, this feature had been removed (likely several years prior).

Original Google Maps icon

On November 28, 2007, Google Maps for Mobile 2.0 was released.[17][18][19] It featured a beta version of a "My Location" feature, which uses the GPS / Assisted GPS location of the mobile device, if available, supplemented by determining the nearest wireless networks and cell sites.[18][19] The software looks up the location of the cell site using a database of known wireless networks and sites.[20][21] By triangulating the different signal strengths from cell transmitters and then using their location property (retrieved from the database), My Location determines the user's current location.[22]

On September 23, 2008, coinciding with the announcement of the first commercial Android device, Google announced that a Google Maps app had been released for its Android operating system.[23][24]

In October 2009, Google replaced Tele Atlas as their primary supplier of geospatial data in the US version of Maps and used their own data.[25]

2011–2015


On April 19, 2011, Map Maker was added to the American version of Google Maps, allowing any viewer to edit and add changes to Google Maps. This provides Google with local map updates almost in real-time instead of waiting for digital map data companies to release more infrequent updates.

Icon used from 2015 to 2020

On January 31, 2012, Google, due to offering its Maps for free, was found guilty of abusing the dominant position of its Google Maps application and ordered by a court to pay a fine and damages to Bottin Cartographer, a French mapping company.[26] This ruling was overturned on appeal.[27]

In June 2012, Google started mapping the UK's rivers and canals in partnership with the Canal and River Trust. The company has stated that "it would update the program during the year to allow users to plan trips which include locks, bridges and towpaths along the 2,000 miles of river paths in the UK."[28]

A monument in the shape of a Google Maps pin in the center of the city of Szczecin, Poland

In December 2012, the Google Maps application was separately made available in the App Store, after Apple removed it from its default installation of the mobile operating system version iOS 6 in September 2012.[29]

On January 29, 2013, Google Maps was updated to include a map of North Korea.[30] As of May 3, 2013, Google Maps recognizes Palestine as a country, instead of redirecting to the Palestinian territories.[31]

In August 2013, Google Maps removed the Wikipedia Layer, which provided links to Wikipedia content about locations shown in Google Maps using Wikipedia geocodes.[32]

On April 12, 2014, Google Maps was updated to reflect the annexation of Ukrainian Crimea by Russia. Crimea is shown as the Republic of Crimea in Russia and as the Autonomous Republic of Crimea in Ukraine. All other versions show a dotted disputed border.[33]

In April 2015, on a map near the Pakistani city of Rawalpindi, the imagery of the Android logo urinating on the Apple logo was added via Map Maker and appeared on Google Maps. The vandalism was soon removed and Google publicly apologized.[34] However, as a result, Google disabled user moderation on Map Maker, and on May 12, disabled editing worldwide until it could devise a new policy for approving edits and avoiding vandalism.[35]

On April 29, 2015, users of the classic Google Maps were forwarded to the new Google Maps, with the option to return to the classic version removed from the interface.[36]

On July 14, 2015, the Chinese name for Scarborough Shoal was removed after a petition from the Philippines was posted on Change.org.[37]

2016–2018


On June 27, 2016, Google rolled out new satellite imagery worldwide sourced from Landsat 8, comprising over 700 trillion pixels of new data.[38] In September 2016, Google Maps acquired mapping analytics startup Urban Engines.[39]

In 2016, the Government of South Korea offered Google conditional access to the country's geographic database – access that domestic Korean mapping providers already have, allowing them to offer high-detail maps. Google declined the offer, as it was unwilling to accept the condition that it reduce the quality of its maps around locations the South Korean government considered sensitive (see restrictions on geographic data in South Korea).[40]

On October 16, 2017, Google Maps was updated with accessible imagery of several planets and moons such as Titan, Mercury, and Venus, as well as direct access to imagery of the Moon and Mars.[41][42]

In May 2018, Google announced major changes to the API structure starting June 11, 2018. This change consolidated the 18 different endpoints into three services and merged the basic and premium plans into one pay-as-you-go plan.[43] This meant a 1,400% price increase for users on the basic plan, with only six weeks' notice, and caused a harsh reaction in the developer community.[44] In June, Google postponed the change to July 16, 2018.

In August 2018, Google Maps changed its fully zoomed-out view to a 3D globe, dropping the Mercator projection, which projects the planet onto a flat surface.[45]

2019–present

2020 icon redesign

In January 2019, Google Maps added speed trap and speed camera alerts as reported by other users.[46][47]

On October 17, 2019, Google Maps was updated to include incident reporting, resembling a functionality in Waze which was acquired by Google in 2013.[48]

In December 2019, Incognito mode was added, allowing users to enter destinations without saving entries to their Google accounts.[49]

In February 2020, Maps received a redesign for its 15th anniversary.[50] It notably added a brand-new app icon, which resembles the original 2005 icon.

On September 23, 2020, Google announced a COVID-19 layer update for Google Maps, designed to show a seven-day average of total COVID-19-positive cases per 100,000 people in the area selected on the map. It also features a label indicating whether case numbers are rising or falling.[51]

In January 2021, Google announced that it would be launching a new feature displaying COVID-19 vaccination sites.[52]

In January 2021, Google announced updates to the route planner that would accommodate drivers of electric vehicles. Routing would take into account the type of vehicle, vehicle status including current charge, and the locations of charging stations.[53]

In June 2022, Google Maps added a layer displaying air quality for certain countries.[54]

In September 2022, Google removed the COVID-19 Layer from Google Maps due to lack of usage of the feature.[55]

Functionality


Directions and transit


Google Maps provides a route planner,[56] allowing users to find available directions through driving, public transportation, walking, or biking.[57] Google has partnered globally with over 800 public transportation providers to adopt GTFS (General Transit Feed Specification), making the data available to third parties.[58][59] Thanks to an October 2019 update, the app can indicate users' transit routes; the incognito mode and eyes-free walking navigation features were released earlier.[60] A July 2020 update added bike-share routes.[61]
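GTFS feeds are plain CSV files, so they can be inspected with standard tooling. The minimal sketch below reads a feed's stops.txt and prints stop names with coordinates; the file path is an assumption, while the field names (stop_id, stop_name, stop_lat, stop_lon) are the ones defined by the GTFS specification.

```python
import csv

# Minimal sketch: list stops from a GTFS feed's stops.txt (path is hypothetical).
# stop_id, stop_name, stop_lat and stop_lon are fields defined by the GTFS spec.
with open("gtfs_feed/stops.txt", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        print(row["stop_id"], row["stop_name"], row["stop_lat"], row["stop_lon"])
```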

In February 2024, Google Maps started rolling out glanceable directions for its Android and iOS apps. The feature allows users to track their journey from their device's lock screen.[62][63]

Traffic conditions

Screenshot of Google Maps with traffic option enabled

In 2007, Google began offering traffic data as a colored overlay on top of roads and motorways to represent the speed of vehicles on particular roads. Crowdsourcing is used to obtain the GPS-determined locations of a large number of cellphone users, from which live traffic maps are produced.[64][65][66]
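Google has not published how its traffic pipeline works; the sketch below is only a simplified illustration of the general idea described above, averaging anonymized speed samples per road segment and bucketing the result into colors. Segment names, sample values, and thresholds are all invented for the example.

```python
from statistics import mean

# Illustrative only: average crowdsourced speed samples (km/h) per road segment
# and map the average to a traffic color. Not Google's actual pipeline; data invented.
samples = {
    "segment_a": [92, 88, 95, 90],
    "segment_b": [34, 28, 40, 31],
    "segment_c": [12, 9, 15, 11],
}

def color(speed_kmh):
    if speed_kmh >= 80:
        return "green"   # free-flowing
    if speed_kmh >= 40:
        return "orange"  # slow
    return "red"         # congested

for segment, speeds in samples.items():
    avg = mean(speeds)
    print(segment, round(avg), color(avg))
```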

Google has stated that the speed and location information it collects to calculate traffic conditions is anonymous.[67] Options available in each phone's settings allow users not to share information about their location with Google Maps.[68] Google stated, "Once you disable or opt out of My Location, Maps will not continue to send radio information back to Google servers to determine your handset's approximate location".[69][failed verification]

Street View

A Google Maps car at Googleplex, Mountain View

On May 25, 2007, Google released Google Street View, a feature of Google Maps providing 360° panoramic street-level views of various locations. On the date of release, the feature only included five cities in the U.S. It has since expanded to thousands of locations around the world. In July 2009, Google began mapping college campuses and surrounding paths and trails.

Street View garnered much controversy after its release because of privacy concerns about the uncensored nature of the panoramic photographs, although the views are only taken on public streets.[70][71] Since then, Google has automatically blurred faces and license plates in the imagery.[72][73][74]

Google Maps Street View Trekker backpack in use on the sidewalk of the Hudson River Greenway in New York City

In late 2014, Google launched Google Underwater Street View, including 2,300 kilometres (1,400 mi) of the Australian Great Barrier Reef in 3D. The images are taken by special cameras which turn 360 degrees and take shots every 3 seconds.[75]

In 2017, Street View navigation of the interior of the International Space Station became available in both Google Maps and Google Earth.

3D imagery


Google Maps has incorporated[when?] 3D models of hundreds of cities in over 40 countries from Google Earth into its satellite view. The models were developed using aerial photogrammetry techniques.[76][77]

Immersive View


At the I/O 2022 event, Google announced Immersive View, a feature of Google Maps which would involve composite 3D images generated from Street View and aerial images of locations using AI, complete with synchronous information. It was initially planned for five cities worldwide, with plans to add other cities later on.[78] The feature was previewed in September 2022 with 250 photorealistic aerial 3D images of landmarks,[79] and was fully launched in February 2023.[80] An expansion of Immersive View to routes was announced at Google I/O 2023,[81] and launched in October 2023 for 15 cities globally.[82]

The feature uses predictive modelling and neural radiance fields to scan Street View and aerial images and generate composite 3D imagery of locations, covering both exteriors and interiors, and of driving, walking, or cycling routes. It also generates synchronous information and forecasts of conditions such as weather, traffic, and busyness up to a month ahead, drawing on historical and environmental data.

Immersive View has been available in the following locations:[citation needed]

Locations with Immersive View:

Argentina: Buenos Aires
Australia: Melbourne, Sydney
Austria: Vienna
Belgium: Brussels
Brazil: Brasília, Rio de Janeiro, São Paulo
Canada: Calgary, Edmonton, Montreal, Ottawa, Toronto, Vancouver
Chile: Santiago
Czech Republic: Prague
France: Nice, Paris
Germany: Berlin, Cologne, Frankfurt, Munich
Greece: Athens
Hong Kong: Hong Kong
Hungary: Budapest
Italy: Florence, Milan, Rome, Venice
Japan: Kyoto, Nagoya, Osaka, Tokyo
Mexico: Guadalajara, Mexico City
Netherlands: Amsterdam
Norway: Oslo
Poland: Warsaw
Portugal: Lisbon, Porto
Romania: Bucharest
Singapore: Singapore
South Africa: Cape Town, Johannesburg
Spain: Barcelona, Madrid
Sweden: Stockholm
Switzerland: Zurich
Taiwan: Taichung, Taipei
United Kingdom: Edinburgh, London
United States: Atlanta, Boston, Chicago, Detroit, Houston, Las Vegas, Los Angeles, Miami, New York City, Philadelphia, San Diego, San Francisco, Seattle
Vatican City: Vatican City

Landmark Icons


On October 3, 2019, Google added icons of city attractions, in a style similar to Apple Maps. In the first stage, such icons were added in nine cities.[83]

45° imagery

An example of the Leaning Tower of Pisa in the 45° view

In December 2009, Google introduced a new view consisting of 45° angle aerial imagery, offering a "bird's-eye view" of cities. The first cities available were San Jose and San Diego. This feature was initially available only to developers via the Google Maps API.[84] In February 2010, it was introduced as an experimental feature in Google Maps Labs.[85] In July 2010, 45° imagery was made available in Google Maps in select cities in South Africa, the United States, Germany and Italy.[86]

Weather


In February 2024, Google Maps incorporated a small weather icon on the top left corner of the Android and iOS mobile apps, giving access to weather and air quality index details.[87]

Lens in Maps


Previously called Search with Live View, Lens in Maps identifies shops, restaurants, transit stations and other street features with a phone's camera and, using AI and augmented reality, overlays relevant information, such as opening hours, current busyness, pricing and reviews, along with a category pin. The feature, if available on the device, can be accessed by tapping the Lens icon in the search bar. It was expanded to 50 new cities in October 2023 in its biggest expansion yet, after initially being released in late 2022 in Los Angeles, San Francisco, New York, London, and Paris.[88][89] Lens in Maps shares features with Live View, which also displays information relating to street features while guiding a user to a selected destination with virtual arrows, signs and guidance.[90]

Business listings

A business listing in Google Maps showing opening times, reviews and photos. This screenshot is from the Android mobile app.

Google collates business listings from multiple on-line and off-line sources. To reduce duplication in the index, Google's algorithm combines listings automatically based on address, phone number, or geocode,[91] but sometimes information for separate businesses will be inadvertently merged with each other, resulting in listings inaccurately incorporating elements from multiple businesses.[92] Google allows business owners to create and verify their own business data through Google Business Profile (GBP), formerly Google My Business (GMB).[93] Owners are encouraged to provide Google with business information including address, phone number, business category, and photos.[94] Google has staff in India who check and correct listings remotely as well as support businesses with issues.[95] Google also has teams on the ground in most countries that validate physical addresses in person.[96] In May 2024, Google announced it would discontinue the chat feature in Google Business Profile. Starting July 15, 2024, new chat conversations would be disabled, and by July 31, 2024, all chat functionalities would end.[97]
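Google does not disclose how its listing-merging algorithm works; the sketch below is a deliberately simplified, hypothetical illustration of the general technique of keying listings on a normalized phone number and address so that records from different sources collapse into one entry. The sample listings are invented.

```python
import re

# Hypothetical sketch of duplicate-listing merging keyed on normalized phone + address.
# Not Google's actual algorithm; the listings below are invented.
def listing_key(listing):
    phone = re.sub(r"\D", "", listing["phone"])                      # digits only
    address = re.sub(r"\s+", " ", listing["address"].lower().strip())
    return (phone, address)

listings = [
    {"name": "Joe's Cafe", "phone": "(02) 9123 4567", "address": "12 George St, Sydney"},
    {"name": "Joes Cafe",  "phone": "02 9123 4567",   "address": "12 George St,  Sydney"},
]

merged = {}
for listing in listings:
    merged.setdefault(listing_key(listing), []).append(listing["name"])

print(merged)  # both source records collapse onto a single key
```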

Google Maps can be manipulated by businesses that are not physically located in the area in which they record a listing. There are cases of people abusing Google Maps to outrank their competition by placing unverified listings on online directory sites, knowing that the information will propagate to Google (duplicate sites). The people who update these listings do not use a registered business name; they place keywords and location details in their Google Maps business title, which can outrank credible business listings. In Australia in particular, genuine companies and businesses have noticed a trend of fake business listings in a variety of industries.[98]

Genuine business owners can also optimize their business listings to gain greater visibility in Google Maps, through a type of search engine marketing called local search engine optimization.[99]

Indoor maps


In March 2011, indoor maps were added to Google Maps, giving users the ability to navigate themselves within buildings such as airports, museums, shopping malls, big-box stores, universities, transit stations, and other public spaces (including underground facilities). Google encourages owners of public facilities to submit floor plans of their buildings in order to add them to the service.[100] Map users can view different floors of a building or subway station by clicking on a level selector that is displayed near any structures which are mapped on multiple levels.

My Maps

Google My Maps

My Maps is a feature in Google Maps launched in April 2007 that enables users to create custom maps for personal use or sharing. Users can add points, lines, shapes, notes and images on top of Google Maps using a WYSIWYG editor.[101] An Android app for My Maps, initially released in March 2013 under the name Google Maps Engine Lite, was available until its removal from the Play Store in October 2021.[102][103][104]

Google Local Guides


Google Local Guides is a volunteer program launched by Google Maps[105] that enables registered users to contribute to Google Maps, sometimes in exchange for additional perks and benefits. Users can reach levels 1 through 10 and be awarded badges. The program is partially a successor to Google Map Maker, as features from the former program became integrated into the website and app.[106]

The program consists of adding reviews, photos, basic information, and videos, and correcting information such as wheelchair accessibility.[107][108] Adding reviews, photos, videos, new places, or new roads, or providing other useful information, earns users points.[109] Users advance to a higher level when they accumulate a certain number of points.[110][111] Starting with Level 4, a star is shown near the user's avatar.[111]

Timelapse


Earth Timelapse, released in April 2021, is a program that lets users see how the Earth has changed over the last 37 years. Google combined 15 million satellite images (roughly ten quadrillion pixels) to create 35 global cloud-free images for the program.[112]

Timeline


If a user shares their location with Google, Timeline summarises their location for each day on a Timeline map.[113] Timeline estimates the mode of travel used to move between places and also shows photos taken at each location. In June 2024, Google started progressively removing access to Timeline on web browsers, with the information instead being stored on the user's local device.[114][115]
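Google has not published how Timeline classifies travel modes; the sketch below is a purely illustrative heuristic that guesses a mode from the average speed between two timestamped points. The thresholds and sample coordinates are arbitrary choices for the example, not Google's actual logic.

```python
from datetime import datetime
from math import radians, sin, cos, asin, sqrt

# Purely illustrative mode-of-travel heuristic based on average speed.
# Not Google's actual Timeline logic; thresholds and points are invented.
def haversine_km(lat1, lon1, lat2, lon2):
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

def guess_mode(p1, p2):
    (lat1, lon1, t1), (lat2, lon2, t2) = p1, p2
    hours = (t2 - t1).total_seconds() / 3600
    speed = haversine_km(lat1, lon1, lat2, lon2) / hours
    if speed < 7:
        return "walking"
    if speed < 25:
        return "cycling"
    return "driving"

start = (52.5200, 13.4050, datetime(2024, 5, 1, 9, 0))
end = (52.5310, 13.3830, datetime(2024, 5, 1, 9, 45))
print(guess_mode(start, end))  # roughly 1.9 km in 45 minutes -> "walking"
```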

Implementation

A split-view screenshot of Google Maps, with the street map shown in the bottom half and Street View in the top half. A user can zoom in and out of either view independently of the other.

As the user drags the map, the grid squares are downloaded from the server and inserted into the page. When a user searches for a business, the results are downloaded in the background for insertion into the side panel and map; the page is not reloaded. A hidden iframe with form submission is used because it preserves browser history. Like many other Google web applications, Google Maps uses JavaScript extensively.[116] The site also uses protocol buffers for data transfer rather than JSON, for performance reasons.
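The grid squares mentioned above are map tiles addressed by zoom level and x/y index in the widely used Web Mercator tiling scheme. The sketch below shows the standard arithmetic for finding which tile contains a given latitude and longitude; it illustrates the tiling math only and makes no assumptions about Google's tile servers or URLs.

```python
import math

# Standard Web Mercator tile arithmetic: which tile (x, y) at zoom z contains a point.
def latlon_to_tile(lat_deg, lon_deg, zoom):
    n = 2 ** zoom                                   # tiles per axis at this zoom level
    lat = math.radians(lat_deg)
    x = int((lon_deg + 180.0) / 360.0 * n)
    y = int((1.0 - math.log(math.tan(lat) + 1.0 / math.cos(lat)) / math.pi) / 2.0 * n)
    return x, y

# Example: the tile covering central London at zoom 12.
print(latlon_to_tile(51.5074, -0.1278, 12))         # (2046, 1362) with this formula
```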

The version of Google Street View for classic Google Maps required Adobe Flash.[117] In October 2011, Google announced MapsGL, a WebGL version of Maps with better renderings and smoother transitions.[118] Indoor maps use JPG, PNG, PDF, BMP, or GIF files for floor plans.[119]

Users who are logged into a Google Account can save locations so that they are overlaid on the map with various colored "pins" whenever they browse the application. These "Saved places" can be organized into default groups or user-named groups and shared with other users. "Starred places" is one default group example; it previously created a record automatically within the now-discontinued product Google Bookmarks.

Map data and imagery


The Google Maps terms and conditions[120] state that usage of material from Google Maps is regulated by Google Terms of Service[121] and some additional restrictions. Google has either purchased local map data from established companies, or has entered into lease agreements to use copyrighted map data.[122] The owner of the copyright is listed at the bottom of zoomed maps. For example, street maps in Japan are leased from Zenrin. Street maps in China are leased from AutoNavi.[123] Russian street maps are leased from Geocentre Consulting and Tele Atlas. Data for North Korea is sourced from the companion project Google Map Maker.

Street map overlays, in some areas, may not match up precisely with the corresponding satellite images. The street data may be entirely erroneous, or simply out of date: "The biggest challenge is the currency of data, the authenticity of data," said Google Earth representative Brian McClendon. As a result, in March 2008 Google added a feature to edit the locations of houses and businesses.[124][125]

Restrictions have been placed on Google Maps through the apparent censoring of locations deemed potential security threats. In some cases the area of redaction covers specific buildings, but in other cases, such as Washington, D.C.,[126] the restriction consists of the use of outdated imagery.

Google Maps API


Google Maps API, now called Google Maps Platform, hosts about 17 different APIs, grouped under three categories: Maps, Places, and Routes.[127]

After the success of reverse-engineered mashups such as chicagocrime.org and housingmaps.com, Google launched the Google Maps API in June 2005[128] to allow developers to integrate Google Maps into their websites. It was a free service that did not require an API key until June 2018 (changes went into effect on July 16), when it was announced that an API key linked to a Google Cloud account with billing enabled would be required to access the API.[129] The API currently does not contain ads, but Google states in their terms of use that they reserve the right to display ads in the future.[130]

By using the Google Maps API, it is possible to embed Google Maps into an external website, onto which site-specific data can be overlaid.[131] Although initially only a JavaScript API, the Maps API was expanded to include an API for Adobe Flash applications (since deprecated), a service for retrieving static map images, and web services for performing geocoding, generating driving directions, and obtaining elevation profiles. Over 1,000,000[132] websites use the Google Maps API, making it the most heavily used web application development API.[133] In September 2011, Google announced it would deprecate the Google Maps API for Flash.[134]
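As an illustration of the geocoding web service mentioned above, the minimal sketch below queries the publicly documented Geocoding endpoint for an address and prints the returned coordinates. The endpoint path and response fields follow Google's public documentation as generally understood, but treat the details, and the placeholder API key, as assumptions to verify against the current Maps Platform documentation; a key linked to a billing-enabled account has been required since 2018.

```python
import json
import urllib.parse
import urllib.request

# Minimal geocoding sketch against the documented Maps Platform web service.
# "YOUR_API_KEY" is a placeholder; a billing-enabled API key is required in practice.
def geocode(address, api_key):
    params = urllib.parse.urlencode({"address": address, "key": api_key})
    url = f"https://maps.googleapis.com/maps/api/geocode/json?{params}"
    with urllib.request.urlopen(url) as resp:
        data = json.load(resp)
    if data["status"] != "OK":
        raise RuntimeError(data["status"])
    location = data["results"][0]["geometry"]["location"]
    return location["lat"], location["lng"]

# Example (requires a valid key):
# print(geocode("1600 Amphitheatre Parkway, Mountain View, CA", "YOUR_API_KEY"))
```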

The Google Maps API was free for commercial use, provided that the site on which it was used was publicly accessible, did not charge for access, and did not generate more than 25,000 map accesses a day.[135][136] Sites that did not meet these requirements could purchase the Google Maps API for Business.[137]

On June 21, 2018, Google increased the prices of the Maps API and began requiring a billing profile.[138]

Google Maps in China


Due to restrictions on geographic data in China, Google Maps must partner with a Chinese digital map provider in order to legally show Chinese map data. Since 2006, this partner has been AutoNavi.[123]

Within China, the State Council mandates that all maps of China use the GCJ-02 coordinate system, which is offset from the WGS-84 system used in most of the world. google.cn/maps (formerly Google Ditu) uses the GCJ-02 system for both its street maps[139] and satellite imagery.[140] google.com/maps also uses GCJ-02 data for the street map, but uses WGS-84 coordinates for satellite imagery,[141] causing the so-called China GPS shift problem.

Frontier alignments also present some differences between google.cn/maps and google.com/maps. On the latter, sections of the Chinese border with India and Pakistan are shown with dotted lines, indicating areas or frontiers in dispute. However, google.cn shows the Chinese frontier strictly according to Chinese claims with no dotted lines indicating the border with India and Pakistan. For example, the South Tibet region claimed by China but administered by India as a large part of Arunachal Pradesh is shown inside the Chinese frontier by google.cn, with Indian highways ending abruptly at the Chinese claim line. Google.cn also shows Taiwan and the South China Sea Islands as part of China. Google Ditu's street map coverage of Taiwan no longer omits major state organs, such as the Presidential Palace, the five Yuans, and the Supreme Court.[142][additional citation(s) needed]

In terms of features, google.cn/maps does not offer My Maps. On the other hand, while google.cn displays virtually all text in Chinese, google.com/maps displays most text (user-selectable real text as well as text on the map) in English.[citation needed] This behavior of displaying English text is not consistent but intermittent; sometimes text appears in English, sometimes in Chinese. The criteria for choosing which language is displayed are not publicly known.[citation needed]

Criticism and controversies


Incorrect location naming


There are cases where Google Maps has added out-of-date neighborhood monikers. In Los Angeles, for example, the name "Brooklyn Heights" was revived from its 1870s usage[143] and "Silver Lake Heights" from its 1920s usage,[144] while in Detroit the neighborhood "Fiskhorn" was mistakenly renamed "Fishkorn".[145] Because many companies use Google Maps data, these previously obscure or incorrect names then gain traction; the names are often used by realtors, hotels, food delivery sites, dating sites, and news organizations.

Google has said it created its maps from third-party data, public sources, satellites, and users, but many names used have not been connected to any official record.[143][145] According to a former Google Maps employee (who was not authorized to speak publicly), users can submit changes to Google Maps, but some submissions are ruled upon by people with little local knowledge of a place, such as contractors in India. Critics maintain that names like "BoCoCa" (for the area in Brooklyn between Boerum Hill, Cobble Hill and Carroll Gardens) are "just plain puzzling" or simply made up.[145] Some names used by Google have been traced to non-professionally made maps with typographical errors that survived on Google Maps.[145]

Potential misuse


In 2005 the Australian Nuclear Science and Technology Organisation (ANSTO) complained about the potential for terrorists to use the satellite images in planning attacks, with specific reference to the Lucas Heights nuclear reactor; however, the Australian Federal government did not support the organization's concern. At the time of the ANSTO complaint, Google had colored over some areas for security (mostly in the U.S.), such as the rooftop of the White House and several other Washington, D.C. buildings.[146][147][148]

In October 2010, Nicaraguan military commander Edén Pastora stationed Nicaraguan troops on the Isla Calero (in the delta of the San Juan River), justifying his action on the border delineation given by Google Maps. Google has since updated its data, which it found to be incorrect.[149]

On January 27, 2014, documents leaked by Edward Snowden revealed that the NSA and the GCHQ intercepted Google Maps queries made on smartphones, and used them to locate the users making these queries. One leaked document, dating to 2008, stated that "[i]t effectively means that anyone using Google Maps on a smartphone is working in support of a GCHQ system."[150]

In May 2015, searches on Google Maps for offensive racial epithets for African Americans such as "nigger", "nigger king", and "nigger house" pointed the user to the White House; Google apologized for the incident.[151][152]

In December 2015, three Japanese netizens were charged with vandalism after they were found to have added an unrelated law firm's name and indecent labels to locations, such as "Nuclear test site" at the Atomic Bomb Dome and "Izumo Satya" at Izumo Taisha.[153][154]

In February 2020, the artist Simon Weckert[155] used 99 cell phones to fake a Google Maps traffic jam.[156]

In September 2024, several schools in Taiwan and Hong Kong were relabeled on Google Maps with incorrect names such as "psychiatric hospital" or "prison". Initially, the changes were believed to be the result of hacker attacks, but police later revealed that local students had carried out the prank. Google quickly corrected the mislabeled entries. Education officials in Taiwan and Hong Kong expressed concern over the incident.[157][158][159]

Misdirection incidents


Australia


In August 2023, a woman driving from Alice Springs to the Harts Range Racecourse was stranded in the Central Australian desert for a night after following directions provided by Google Maps.[160][161] She later discovered that Google Maps was providing directions for the actual Harts Range instead of the rodeo. Google said it was looking into the naming of the two locations and consulting with "local and authoritative sources" to solve the issue.[160]

In February 2024, two German tourists were stranded for a week after Google Maps directed them to follow a dirt track through Oyala Thumotang National Park and their vehicle became trapped in mud.[162][163] Queensland Parks and Wildlife Service ranger Roger James said, "People should not trust Google Maps when they're travelling in remote regions of Queensland, and they need to follow the signs, use official maps or other navigational devices."[162]

North America


In June 2019, Google Maps provided nearly 100 Colorado drivers an alternative route that led to a dirt road after a crash occurred on Peña Boulevard. The road had been turned to mud by rain, resulting in nearly 100 vehicles being trapped.[164][161] Google said in a statement, "While we always work to provide the best directions, issues can arise due to unforeseen circumstances such as weather. We encourage all drivers to follow local laws, stay attentive, and use their best judgment while driving."[164]

In September 2023, Google was sued by a North Carolina resident who alleged that Google Maps had directed her husband over the Snow Creek Bridge in Hickory the year prior, resulting in him drowning. According to the lawsuit, multiple people had notified Google about the state of the bridge, which collapsed in 2013, but Google had not updated the route information and continued to direct users over the bridge.[165][166][161] At the time of the man's death, the barriers placed to block access to the bridge had been vandalized.[167][168]

In November 2023, a hiker was rescued by helicopter on the backside of Mount Fromme in Vancouver. North Shore Rescue stated on its Facebook page that the hiker had followed a non-existent hiking trail on Google Maps. This was also the second hiker in two months to require rescuing after following the same trail. The fake trail has since been removed from the app.[169][170]

Also in November 2023, Google apologized after users were directed through desert roads after parts of Interstate 15 were closed due to a dust storm.[171] Drivers became stranded after following the suggested detour route, which was a "bumpy dirt trail".[172] Following the incident, Google stated that Google Maps would "no longer route drivers traveling between Las Vegas and Barstow down through those roads."[171]

Russia


In 2020, a teenage motorist was found frozen to death, and his passenger alive but with severe frostbite, after they used Google Maps, which had led them onto a shorter but abandoned section of the R504 Kolyma Highway, where their Toyota Chaser became disabled.[173]

India


In 2024, three men from Uttar Pradesh died after their car fell from an under-construction bridge into the Ramganga River. They had been using Google Maps for navigation, which misdirected them onto the bridge.[174][175]

Renaming of the Gulf of Mexico


In February 2025, as a response to Donald Trump's Executive Order 14172, the Gulf of Mexico was renamed to "Gulf of America" for US users and "Gulf of Mexico (Gulf of America)" elsewhere, except for Mexico itself where it remained the Gulf of Mexico. The decision received criticism, with Mexican president Claudia Sheinbaum asking Google to reconsider its decision.[176] Google subsequently blocked and deleted negative reviews of the gulf after the name change occurred.[177][178]

Discontinued features


Google Latitude


Google Latitude was a feature that let users share their physical locations with other people. This service was based on Google Maps, specifically on mobile devices. There was an iGoogle widget for desktops and laptops as well.[179] Some concerns were expressed about the privacy issues raised by the use of the service.[180] On August 9, 2013, this service was discontinued,[181] and on March 22, 2017, Google incorporated the features from Latitude into the Google Maps app.[182]

Google Map Maker


In areas where Google Map Maker was available, for example, much of Asia, Africa, Latin America and Europe as well as the United States and Canada, anyone who logged into their Google account could directly improve the map by fixing incorrect driving directions, adding biking trails, or adding a missing building or road. General map errors in Australia, Austria, Belgium, Denmark, France, Liechtenstein, Netherlands, New Zealand, Norway, South Africa, Switzerland, and the United States could be reported using the Report a Problem link in Google Maps and would be updated by Google.[183] For areas where Google used Tele Atlas data, map errors could be reported using Tele Atlas map insight.[184]

If imagery was missing, outdated, misaligned, or generally incorrect, one could notify Google through their contact request form.[185]

In November 2016, Google announced the discontinuation of Google Map Maker as of March 2017.[186]

Mobile app

Screenshot of Google Maps on Android 14

Google Maps (mobile app)
Original author(s): Google
Initial release: 2006
Stable releases:
Android: 25.10.04 (Build 732665141) / March 7, 2025[187][188]
Wear OS: 25.09.00 (Build 730474011) / February 25, 2025[187][189]
iOS: 25.10.02 / March 7, 2025[190]
Preview release:
Android (Beta): 11.143.0303 / August 20, 2024[193]
Discontinued:
Android Go:[a] 161.1 / October 13, 2023[191][192]
Operating systems: Android, Wear OS, iOS; formerly Java ME, Symbian, Windows Mobile

Google Maps is available as a mobile app for the Android and iOS mobile operating systems. The first mobile version of Google Maps (then known as Google Local for Mobile) was launched in beta in November 2005 for mobile platforms supporting J2ME.[194][195][196] It was released as Google Maps for Mobile in 2006.[197] In 2007 it came preloaded on the first iPhone in a deal with Apple.[198] A version specifically for Windows Mobile was released in February 2007[199] and the Symbian app was released in November 2007.[200]

Version 2.0 of Google Maps Mobile was announced at the end of 2007, with a stand-out "My Location" feature that finds the user's location using cell towers, without needing GPS.[201][202][203] In September 2008, Google Maps was released for, and preloaded on, Google's own new Android platform.[204][205]

Up until iOS 6, the built-in maps application on the iOS operating system was powered by Google Maps. However, with the announcement of iOS 6 in June 2012, Apple announced that they had created their own Apple Maps mapping service,[206] which officially replaced Google Maps when iOS 6 was released on September 19, 2012.[207] However, at launch, Apple Maps received significant criticism from users due to inaccuracies, errors and bugs.[208][209] One day later, The Guardian reported that Google was preparing its own Google Maps app,[210] which was released on December 12, 2012.[211][212] Within two days, the application had been downloaded over ten million times.[213]

Features


The Google Maps apps for iOS and Android have many of the same features, including turn-by-turn navigation, street view, and public transit information.[214][215] Turn-by-turn navigation was originally announced by Google as a separate beta testing app exclusive to Android 2.0 devices in October 2009.[216][217] The original standalone iOS version did not support the iPad,[215] but tablet support was added with version 2.0 in July 2013.[218] An update in June 2012 for Android devices added support for offline access to downloaded maps of certain regions,[219][220] a feature that was eventually released for iOS devices, and made more robust on Android, in May 2014.[221][222]

At the end of 2015, Google Maps announced its new offline functionality,[223] albeit with various limitations: the downloaded area cannot exceed 120,000 square kilometers[224][225] and requires a considerable amount of storage space.[226] In January 2017, Google added a feature exclusively to Android that, in some U.S. cities, indicates the level of difficulty of finding available parking spots,[227] and on both Android and iOS, the app can, as of an April 2017 update, remember where users parked.[228][229] In August 2017, Google Maps for Android was updated with new functionality to actively help the user find parking lots and garages close to a destination.[230] In December 2017, Google added a new two-wheeler mode to its Android app, designed for users in India, allowing for more accessibility in traffic conditions.[231][232] In 2019, the Android version introduced a new feature called Live View that displays directions directly on the road using augmented reality.[233] Google Maps won the 2020 Webby Award for Best User Interface in the category Apps, Mobile & Voice.[234] In March 2021, Google added a feature in which users can draw missing roads.[235] In June 2022, Google implemented support for toll calculation: both the iOS and Android apps report how much the user will have to pay in tolls when a route that includes toll roads is entered. The feature is available for roads in the US, India, Japan and Indonesia, with further expansion planned; according to reports, around 2,000 toll roads are covered in this phase.[236]
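As a rough way to see what the 120,000 km² offline limit means in practice, the sketch below estimates the area of a latitude/longitude bounding box on a spherical Earth and checks it against that cap. The limit value comes from the sources above; the formula is the standard spherical-rectangle area, and the example coordinates are arbitrary.

```python
import math

EARTH_RADIUS_KM = 6371.0
OFFLINE_LIMIT_KM2 = 120_000          # cap reported for offline map downloads

def bbox_area_km2(lat_min, lat_max, lon_min, lon_max):
    """Approximate area of a latitude/longitude bounding box on a spherical Earth."""
    lat_term = abs(math.sin(math.radians(lat_max)) - math.sin(math.radians(lat_min)))
    lon_term = math.radians(abs(lon_max - lon_min))
    return EARTH_RADIUS_KM ** 2 * lat_term * lon_term

# Example: a roughly 3 x 3 degree box around Sydney (coordinates arbitrary).
area = bbox_area_km2(-35.5, -32.5, 149.5, 152.5)
print(round(area), "km2,", "within limit" if area <= OFFLINE_LIMIT_KM2 else "too large")
```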

Reception


USA Today welcomed the application back to iOS, saying: "The reemergence in the middle of the night of a Google Maps app for the iPhone is like the return of an old friend. Only your friend, who'd gone missing for three months, comes back looking better than ever."[237] Jason Parker of CNET, calling it "the king of maps", said, "With its iOS Maps app, Google sets the standard for what mobile navigation should be and more."[238] Bree Fowler of the Associated Press compared Google's and Apple's map applications, saying: "The one clear advantage that Apple has is style. Like Apple devices, the maps are clean and clear and have a fun, pretty element to them, especially in 3-D. But when it comes down to depth and information, Google still reigns superior and will no doubt be welcomed back by its fans."[239] Gizmodo gave it a ranking of 4.5 stars, stating: "Maps Done Right".[240] According to The New York Times, Google "admits that [its iOS app] is even better than Google Maps for Android phones, which has accommodated its evolving feature set mainly by piling on menus".[241]

Google Maps' location tracking is regarded by some as a threat to users' privacy, with Dylan Tweney of VentureBeat writing in August 2014 that "Google is probably logging your location, step by step, via Google Maps", and linked users to Google's location history map, which "lets you see the path you've traced for any given day that your smartphone has been running Google Maps". Tweney then provided instructions on how to disable location history.[242] The history tracking was also noticed, and recommended disabled, by editors at CNET[243] and TechCrunch.[244] Additionally, Quartz reported in April 2014 that a "sneaky new privacy change" would have an effect on the majority of iOS users. The privacy change, an update to the Gmail iOS app that "now supports sign-in across Google iOS apps, including Maps, Drive, YouTube and Chrome", meant that Google would be able to identify users' actions across its different apps.[245]

The Android version of the app surpassed five billion installations in March 2019.[246] By November 2021, the Android app had surpassed 10 billion installations.[247]

Go version


Google Maps Go, a version of the app designed for lower-end devices, was released in beta in January 2018.[248] By September 2018, the app had over 10 million installations.[249]

Artistic and literary uses


The German "geo-novel" Senghor on the Rocks (2008) presents its story as a series of spreads showing a Google Maps location on the left and the story's text on the right. Annika Richterich explains that the "satellite pictures in Senghor on the Rocks illustrate the main character's travel through the West-African state of Senegal".[250]

Artists have used Google Street View in a range of ways. Emilio Vavarella's The Google Trilogy includes glitchy images and unintended portraits of the drivers of the Street View cars.[251] The Japanese band group inou used Google Street View backgrounds to make a music video for their song EYE.[252] The Canadian band Arcade Fire made a customized music video that used Street View to show the viewer their own childhood home.[253][254]


Notes

  1. ^ Lite version for Android

References

  1. ^ "Google Maps Metrics and Infographics". Google Maps for iPhone. Archived from the original on March 21, 2022. Retrieved April 1, 2021.
  2. ^ "Our history in depth". Google Company. Archived from the original on April 6, 2016. Retrieved June 13, 2016.
  3. ^ "Google Maps API". Google Developers. Archived from the original on April 20, 2012.
  4. ^ Perez, Sarah (November 8, 2016). "Google to shut down Map Maker, its crowdsourced map editing tool". TechCrunch. Archived from the original on August 11, 2017. Retrieved June 23, 2017.
  5. ^ "Blurry or outdated imagery". Google Earth Help. Archived from the original on October 24, 2013. Retrieved January 12, 2014.
  6. ^ Anderson, Frank (October 18, 2011). "How Often is Google Maps and Google Earth Updated?". TechnicaMix. Archived from the original on December 3, 2013. Retrieved November 24, 2013.
  7. ^ "Map Types – Google Maps JavaScript API v3 — Google Developers". Google Inc. July 27, 2012. Archived from the original on January 15, 2013. Retrieved January 3, 2013.
  8. ^ "Google+ Smartphone App Popularity". Business Insider. Archived from the original on September 6, 2013. Retrieved September 6, 2013.
  9. ^ "Google mapper: Take browsers to the limit". CNET. Archived from the original on October 26, 2012. Retrieved January 3, 2013.
  10. ^ Kiss, Jemima (June 17, 2009). "Secrets of a nimble giant". London: Guardian. Archived from the original on February 19, 2014. Retrieved October 30, 2010.
  11. ^ Orlowski, Andrew (October 28, 2004). "Google buys CIA-backed mapping startup". The Register. Archived from the original on February 11, 2017. Retrieved April 27, 2017.
  12. ^ Bazeley, Michael (March 30, 2005). "Google acquires traffic info start-up Zipdash". SiliconBeat. Archived from the original on January 2, 2008. Retrieved January 8, 2008.
  13. ^ Taylor, Bret (February 8, 2005). "Mapping your way". Official Google Blog. Archived from the original on May 25, 2010. Retrieved January 12, 2010.
  14. ^ "Google accused of airbrushing Katrina history". NBC News. The Associated Press. March 30, 2007. Archived from the original on November 3, 2020. Retrieved April 17, 2020.
  15. ^ Claburn, Thomas (April 2, 2007). "Google Restores Katrina's Scars To Google Earth". Information Week. Archived from the original on August 19, 2009.
  16. ^ "Google Maps User Guide". Google Maps. Archived from the original on November 5, 2007. Retrieved November 21, 2021.
  17. ^ "Google Announces Launch of Google Maps for Mobile With "My Location" Technology". News from Google. November 28, 2007. Archived from the original on April 26, 2017. Retrieved April 25, 2017.
  18. ^ a b Marshall, Matt (November 28, 2007). "Google releases useful "my location" feature for cellphones". VentureBeat. Archived from the original on April 25, 2017. Retrieved April 25, 2017.
  19. ^ a b Schonfeld, Erick (November 28, 2007). "Google Mobile Maps PinPoints Your Location Without GPS". TechCrunch. AOL. Archived from the original on April 26, 2017. Retrieved April 25, 2017.
  20. ^ Ray, Bill (November 29, 2007). "Google Maps Mobile knows where you are". The Register. Situation Publishing. Archived from the original on October 25, 2020. Retrieved April 25, 2017.
  21. ^ Mills, Elinor (November 28, 2007). "Google Maps for Mobile adds 'My Location' feature". CNET. CBS Interactive. Archived from the original on October 29, 2020. Retrieved April 25, 2017.
  22. ^ Overbo, Mike (November 28, 2007). "Google Maps: My Location". iMore. Archived from the original on April 26, 2017. Retrieved April 25, 2017.
  23. ^ Vanlerberghe, Mac (September 23, 2008). "Google on Android". Google Mobile Blog. Archived from the original on December 7, 2017. Retrieved April 30, 2017.
  24. ^ Tseng, Erick (September 23, 2008). "The first Android-powered phone". Official Google Blog. Archived from the original on December 7, 2017. Retrieved April 30, 2017.
  25. ^ "Google Replaces Tele Atlas Data in US with Google StreetView Data". blumenthals.com. October 12, 2009. Archived from the original on October 15, 2009.
  26. ^ "France Convicts Google for Its Free(dom)". NBC San Diego. February 3, 2012. Archived from the original on October 18, 2014. Retrieved October 13, 2014.
  27. ^ "France: Google wins court decision vs Evermaps". November 29, 2015. Archived from the original on October 22, 2020. Retrieved November 21, 2018.
  28. ^ "Google begins mapping UK rivers". The Daily Telegraph. June 19, 2012. Archived from the original on June 19, 2012. Retrieved June 20, 2012.cite web: CS1 maint: bot: original URL status unknown (link)
  29. ^ "Google Maps for iOS Hits Apple App Store". PCMag. Archived from the original on December 16, 2012. Retrieved December 12, 2012.
  30. ^ Sieczkowski, Cavan (January 29, 2013). "Google Maps North Korea: Prison Camps, Nuclear Complexes Pinpointed In New Images (PHOTOS)". The Huffington Post. Archived from the original on February 9, 2013. Retrieved May 20, 2013.
  31. ^ "Google changes Palestinian location from 'Territories' to 'Palestine'". Fox News. Associated Press. May 3, 2013. Archived from the original on May 21, 2013. Retrieved May 20, 2013.
  32. ^ Google Maps Drops Wikipedia Layer Archived October 6, 2013, at the Wayback Machine. Search Engine Roundtable. (September 10, 2013)
  33. ^ "Google Maps Displays Crimean Border Differently In Russia, U.S." NPR.org. April 12, 2014. Archived from the original on November 26, 2014. Retrieved December 4, 2014.
  34. ^ Hern, Alex (April 24, 2015). "Google Maps hides an image of the Android robot urinating on Apple". The Guardian. Archived from the original on May 17, 2015. Retrieved May 22, 2015.
  35. ^ Kanakarajan, Pavithra (May 22, 2015). "Map Maker will be temporarily unavailable for editing starting May 12, 2015". Google Product Forums. Archived from the original on December 3, 2018. Retrieved May 10, 2015.
  36. ^ "Google Is Getting Rid of Classic Maps for Good (Ugh.)". April 29, 2015. Archived from the original on May 15, 2015. Retrieved May 14, 2015.
  37. ^ "Google Maps alters disputed South China Sea shoal name". BBC News. July 14, 2015. Archived from the original on July 14, 2015. Retrieved July 14, 2015.
  38. ^ Meyer, Robinson (June 27, 2016). "Google's Satellite Map Gets a 700-Trillion-Pixel Makeover". The Atlantic. Archived from the original on June 27, 2016. Retrieved June 27, 2016.
  39. ^ Heater, Brian (September 16, 2016). "Google Maps picks up mapping analytics and visualization startup Urban Engines". TechCrunch. Archived from the original on September 17, 2016. Retrieved September 16, 2016.
  40. ^ Badalge, Keshia; Fairchild, Cullen (February 26, 2018). "One thing North Korea has that the South doesn't: Google Maps". Asia Times. Archived from the original on January 25, 2024. Retrieved March 4, 2021.
  41. ^ Marquardt, Stafford (October 16, 2017). "Space out with planets in Google Maps". Blog.Google. Archived from the original on October 16, 2017. Retrieved October 17, 2017.
  42. ^ Lardinois, Frederic (October 16, 2017). "Google Maps now lets you explore your local planets and moons". TechCrunch. Archived from the original on October 16, 2017. Retrieved October 17, 2017.
  43. ^ Protalinski, Emil (May 2, 2018). "Google Maps Platform arrives with pay-as-you-go billing, free support, and Cloud requirement starting June 11". VentureBeat. Archived from the original on December 4, 2018. Retrieved April 3, 2019.
  44. ^ Singh, Ishveena (May 3, 2018). "Developers up in arms over Google Maps API 'insane' price hike". Geoawesomeness. Archived from the original on August 8, 2020. Retrieved August 7, 2020.
  45. ^ "Google Maps now depicts the Earth as a globe". The Verge. Archived from the original on November 11, 2020. Retrieved August 6, 2018.
  46. ^ "Speed trap warnings begin rolling out to some Google Maps users". Android Central. January 16, 2019. Archived from the original on October 22, 2020. Retrieved January 16, 2019.
  47. ^ "Google Maps shows you how fast you're driving. Here's how". CNET. June 9, 2019. Archived from the original on November 23, 2020. Retrieved June 10, 2019.
  48. ^ "New ways to report driving incidents on Google Maps". Google. October 17, 2019. Archived from the original on October 22, 2020. Retrieved April 20, 2020.
  49. ^ "Updates to Incognito mode and your Timeline in Maps". Google. December 9, 2019. Archived from the original on January 24, 2021. Retrieved April 20, 2020.
  50. ^ "Google Maps is turning 15! Celebrate with a new look and features". Google. February 6, 2020. Archived from the original on January 10, 2021. Retrieved April 20, 2020.
  51. ^ "Navigate safely with new COVID data in Google Maps". Google. September 23, 2020. Archived from the original on December 3, 2020. Retrieved September 23, 2020.
  52. ^ Valinsky, Jordan (January 25, 2021). "Google Maps will soon display Covid-19 vaccination sites". CNN. Archived from the original on January 25, 2021. Retrieved January 25, 2021.
  53. ^ Bogdan, Popa (January 28, 2021). "Google Releases Big Google Maps Update for a Next-Generation Driving Experience". autoevolution. Archived from the original on January 28, 2021. Retrieved January 28, 2021.
  54. ^ "How to use Google Maps to see air quality?". MARCA. June 13, 2022. Archived from the original on June 16, 2022. Retrieved June 16, 2022.
  55. ^ Schoon, Ben (October 26, 2022). "Google Maps has removed its COVID-19 layer". 9to5Google. Archived from the original on April 10, 2023. Retrieved April 10, 2023.
  56. ^ Luckerson, Victor (February 9, 2015). "10 Google Maps Tricks You Need to Know". Time. Archived from the original on February 9, 2015. Retrieved December 7, 2017.
  57. ^ "Get directions and show routes". Google Maps Help. Google Inc. Archived from the original on July 2, 2016. Retrieved December 7, 2017.
  58. ^ Brown, Jessica (September 26, 2017). "Google Maps must improve if it wants cyclists to use it". The Guardian. Archived from the original on December 21, 2020. Retrieved July 12, 2018.
  59. ^ "The Case for Unshackling Transit Data". CityLab. Archived from the original on January 20, 2021. Retrieved July 12, 2018.
  60. ^ "Google Maps may soon get a dark mode and ability to star transit lines". xda-developers. October 10, 2019. Archived from the original on October 20, 2020. Retrieved October 11, 2019.
  61. ^ "Google Maps rolls out end-to-end directions for bikeshare users". TechCrunch. July 20, 2020. Archived from the original on January 26, 2021. Retrieved July 24, 2020.
  62. ^ Li, Abner (February 27, 2024). "Google Maps adds 'Glanceable directions while navigating' setting". 9to5Google. Archived from the original on March 3, 2024. Retrieved May 2, 2024.
  63. ^ Weatherbed, Jess (February 28, 2024). "Google Maps is finally rolling out glanceable directions". The Verge. Archived from the original on May 2, 2024. Retrieved May 2, 2024.
  64. ^ Wang, David (February 28, 2007). "Stuck in traffic?". Archived from the original on February 12, 2017. Retrieved February 13, 2014.
  65. ^ "Real time traffic information with Google Maps". CrackBerry. March 22, 2007. Archived from the original on July 5, 2014. Retrieved June 23, 2014.
  66. ^ Matthews, Susan E. (July 3, 2013). "How Google Tracks Traffic". The Connectivist. Archived from the original on February 22, 2014.
  67. ^ Barth, Dave (August 25, 2009). "The Bright Side of Sitting in Traffic: Crowdsourcing Road Congestion Data". Official Google Blog. Archived from the original on February 4, 2018. Retrieved April 3, 2019.
  68. ^ Matthews, Susan E. (July 3, 2013). "How Google Tracks Traffic". The Connectivist. Archived from the original on February 22, 2014. Retrieved February 13, 2014.
  69. ^ "Help Google Maps find my location". Google Inc. Archived from the original on October 24, 2020. Retrieved December 8, 2016.
  70. ^ "The Google 'ick' factor". July 15, 2007. Archived from the original on August 17, 2009. Retrieved July 9, 2009.
  71. ^ Poulsen, Kevin (July 15, 2007). "Want Off Street View?". Wired. Archived from the original on June 18, 2007.
  72. ^ Petronzio, Matt (August 22, 2012). "11 Fascinating Facts About Google Maps". Mashable. Archived from the original on April 10, 2015. Retrieved April 3, 2015. Google employs automatic face and license plate blurring technology to protect people's privacy in Street View, and users can even request additional blurring. Aerial imagery provides much less detail and resolution.
  73. ^ "Google begins blurring faces in Street View". May 13, 2008. Archived from the original on June 26, 2011. Retrieved June 11, 2020.
  74. ^ "How Google Street View Became An Art Form". Fast Company. May 25, 2017. Archived from the original on November 25, 2020. Retrieved July 12, 2018.
  75. ^ "Google Launches Underwater Street View". November 16, 2014. Archived from the original on November 29, 2014.
  76. ^ "Explore the world with tour guide and 3D imagery in Google Earth 7". Google LatLong Blog. Archived from the original on January 28, 2016. Retrieved July 24, 2016.
  77. ^ "Google Earth adds new 3D imagery in 21 cities to its 11,000 guided tours of our planet". November 2012. Archived from the original on February 21, 2019. Retrieved July 24, 2016.
  78. ^ "Immersive view coming soon to Maps — plus more updates". May 11, 2022. Archived from the original on May 13, 2022. Retrieved May 13, 2022.
  79. ^ "4 new updates that make Maps look and feel more like the real world". Google. September 28, 2022. Archived from the original on December 27, 2023. Retrieved December 27, 2023.
  80. ^ "New ways Maps is getting more immersive and sustainable". Google. February 8, 2023. Archived from the original on December 27, 2023. Retrieved December 27, 2023.
  81. ^ "New ways AI is making Maps more immersive". Google. May 10, 2023. Archived from the original on December 27, 2023. Retrieved December 27, 2023.
  82. ^ "New Maps updates: Immersive View for routes and other AI features". Google. October 26, 2023. Archived from the original on December 18, 2023. Retrieved December 27, 2023.
  83. ^ "Google Maps Adds Apple-Style Landmark Icons".[permanent dead link]
  84. ^ Wilson, Randy (December 8, 2009). "Google LatLong: Changing your perspective". Google-latlong.blogspot.com. Archived from the original on October 17, 2010. Retrieved September 18, 2010.
  85. ^ Schroeder, Stan (February 12, 2010). "Google Maps Get Labs With 9 Cool New Features". Mashable. Archived from the original on February 16, 2017. Retrieved April 3, 2019.
  86. ^ Axon, Samuel (July 11, 2010). "Google Maps Adds 45° Aerial Imagery For All Users". Mashable. Archived from the original on December 17, 2017. Retrieved April 3, 2019.
  87. ^ "How to check weather & air quality details on Google Maps". The Indian Express. February 7, 2024. Archived from the original on February 8, 2024. Retrieved February 9, 2024.
A web directory or link directory is an online list or catalog of websites. That is, it is a directory on the World Wide Web of (all or part of) the World Wide Web. Historically, directories typically listed entries on people or businesses, and their contact information; such directories are still in use today. A web directory includes entries about websites, including links to those websites, organized into categories and subcategories.[1][2][3] Besides a link, each entry may include the title of the website, and a description of its contents. In most web directories, the entries are about whole websites, rather than individual pages within them (called "deep links"). Websites are often limited to inclusion in only a few categories.
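
As a rough, purely illustrative sketch (the class and field names are invented for this example and are not taken from any particular directory software), the structure described above, with entries holding a title, link, and description and grouped into categories and subcategories, can be modelled in Python roughly as follows:

    # Illustrative only: a tiny model of a web directory's structure.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Entry:
        title: str
        url: str          # link to the whole website, not a deep link
        description: str

    @dataclass
    class Category:
        name: str
        entries: List[Entry] = field(default_factory=list)
        subcategories: List["Category"] = field(default_factory=list)

    # Example: a "Shopping" category containing a single listed site.
    shopping = Category("Shopping")
    shopping.entries.append(
        Entry("Example Store", "https://example.com", "A retail e-commerce site.")
    )
    root = Category("Top", subcategories=[shopping])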

There are two ways to find information on the Web: by searching or browsing. Web directories provide links in a structured list to make browsing easier. Many web directories combine searching and browsing by providing a search engine to search the directory. Unlike search engines, which base results on a database of entries gathered automatically by web crawler, most web directories are built manually by human editors. Many web directories allow site owners to submit their site for inclusion, and have editors review submissions for fitness.

Web directories may be general in scope, or limited to particular subjects or fields. Entries may be listed for free, or by paid submission (meaning the site owner must pay to have his or her website listed).

RSS directories are similar to web directories, but contain collections of RSS feeds, instead of links to websites.

History


During the early development of the web, there was a list of web servers edited by Tim Berners-Lee and hosted on the CERN webserver. One historical snapshot from 1992 remains.[4] He also created the World Wide Web Virtual Library, which is the oldest web directory.[5]

Scope of listing


Most directories are general in scope and list websites across a wide range of categories, regions, and languages. Some niche directories, however, focus on restricted regions, single languages, or specialist sectors. For example, there are shopping directories that specialize in the listing of retail e-commerce sites.

Examples of well-known general web directories are Yahoo! Directory (shut down at the end of 2014) and DMOZ (shut down on March 14, 2017). DMOZ was significant due to its extensive categorization and large number of listings and its free availability for use by other directories and search engines.[6]

However, a debate over the quality of directories and databases continues, as search engines have used DMOZ's content without real integration, and some have experimented with clustering.

Development


There have been many attempts to make building web directories easier, such as automated submission of related links by script, or any number of available PHP portals and programs. More recently, social software techniques have spawned new approaches to categorization, with Amazon.com adding tagging to its product pages.

Monetizing


Directories have various features in their listings, often depending upon the price paid for inclusion:

  • Cost
    • Free submission – there is no charge for the review and listing of the site
    • Paid submission – a one-time or recurring fee is charged for reviewing/listing the submitted link
  • No follow – a rel="nofollow" attribute is associated with the link, meaning search engines give the link no ranking weight (see the sketch after this list)
  • Featured listing – the link is given a premium position in a category (or multiple categories) or other sections of the directory, such as the homepage. Sometimes called sponsored listing.
  • Bid for position – where sites are ordered based on bids
  • Affiliate links – where the directory earns commission for referred customers from the listed websites
  • Reciprocity
    • Reciprocal link – a link back to the directory must be added somewhere on the submitted site in order to get listed in the directory. This strategy has decreased in popularity due to changes in search engine algorithms that can make it less valuable or even counterproductive.[7]
    • No reciprocal link – the directory lists submitted sites for free and does not require a link back to the submitted website
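
As a minimal sketch of the "No follow" option above (the function name and markup are assumptions for illustration, not code from any real directory), a listing's outbound link might be rendered with or without the rel="nofollow" attribute like this:

    # Illustrative only: render a listing link, optionally marked nofollow.
    def render_listing_link(url: str, title: str, nofollow: bool) -> str:
        rel = ' rel="nofollow"' if nofollow else ''
        return f'<a href="{url}"{rel}>{title}</a>'

    # A free listing might be marked nofollow, while a paid one is left followed.
    print(render_listing_link("https://example.com", "Example Store", nofollow=True))
    # -> <a href="https://example.com" rel="nofollow">Example Store</a>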

Human-edited web directories


A human-edited directory is created and maintained by editors who add links based on the policies particular to that directory. Human-edited directories are often targeted by SEOs on the basis that links from reputable sources will improve rankings in the major search engines. Some directories may prevent search engines from rating a displayed link by using redirects, nofollow attributes, or other techniques. Many human-edited directories, including DMOZ, World Wide Web Virtual Library, Business.com and Jasmine Directory, are edited by volunteers, who are often experts in particular categories. These directories are sometimes criticized due to long delays in approving submissions, or for rigid organizational structures and disputes among volunteer editors.

In response to these criticisms, some volunteer-edited directories have adopted wiki technology, to allow broader community participation in editing the directory (at the risk of introducing lower-quality, less objective entries).

Another direction taken by some web directories is the paid-for-inclusion model. This method enables the directory to offer timely inclusion of submissions and generally results in fewer listings as a consequence of the paid model. Such directories often offer additional options to further enhance listings, including featured placement and extra links to inner pages of the listed website. These options typically carry an additional fee but can significantly increase the visibility of sites and their inner pages.

Today, submission of websites to web directories is considered a common SEO (search engine optimization) technique for obtaining backlinks to the submitted website. One distinctive feature of directory submission is that it cannot be fully automated in the way search engine submissions can. Manual directory submission is a tedious and time-consuming job and is often outsourced by webmasters.

Bid for Position directories


Bid for Position directories, also known as bidding web directories, are paid-for-inclusion web directories in which listings are ordered according to their bid amount: the more a site owner pays, the higher the website appears in the directory's list. The higher position makes the website more visible and increases the chance that visitors browsing the directory will click the listing.
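
As a purely illustrative sketch of that ordering rule (the listing data below is invented for the example), positions can be assigned by sorting listings on their bid amount, highest first:

    # Illustrative only: order bid-for-position listings by bid, highest first.
    listings = [
        {"site": "example-a.com", "bid": 0.50},
        {"site": "example-b.com", "bid": 2.00},
        {"site": "example-c.com", "bid": 1.25},
    ]

    ranked = sorted(listings, key=lambda entry: entry["bid"], reverse=True)
    for position, entry in enumerate(ranked, start=1):
        print(position, entry["site"], entry["bid"])   # position 1 is the most visible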

Propagation


Web directories often make themselves accessible through more and more URLs by acquiring the domain registrations of defunct websites as soon as they expire, a practice known as domain drop catching.


References

  1. ^ "Web directory". Dictionary.com. Retrieved 11 November 2023.
  2. ^ Wendy Boswell. "What is a Web Directory". About.com. Archived from the original on 2010-01-07. Retrieved 2010-02-25.
  3. ^ "Web Directory Or Directories". yourmaindomain. Retrieved 30 August 2013.
  4. ^ "World-Wide Web Servers". W3C. Retrieved 2012-05-14.
  5. ^ Aaron Wall. "History of Search Engines: From 1945 to Google Today". Search Engine History. Retrieved 2017-05-16.
  6. ^ Paul Festa (December 27, 1999), Web search results still have human touch, CNET News.com, retrieved September 18, 2007
  7. ^ Schmitz, Tom (August 2, 2012). "What Everyone Needs To Know About Good, Bad & Bland Links". searchengineland.com. Third Door Media. Retrieved April 21, 2017. Reciprocal links may not help with competitive keyword rankings, but that does not mean you should avoid them when they make sound business sense. What you should definitely avoid are manipulative reciprocal linking schemes like automated link trading programs and three-way links or four-way links.

 

Frequently Asked Questions

Local SEO services in Sydney focus on optimizing a business's online presence to attract local customers. This includes claiming local business listings, optimizing Google My Business profiles, using location-specific keywords, and ensuring consistent NAP (Name, Address, Phone) information across the web.

SEO agencies in Sydney typically offer comprehensive services such as keyword research, technical audits, on-page and off-page optimization, content creation, and performance tracking. Their goal is to increase your site's search engine rankings and drive more targeted traffic to your website.

A local SEO agency specializes in improving a business's visibility within a specific geographic area. They focus on optimizing local citations, managing Google My Business profiles, and targeting location-based keywords to attract nearby customers.

SEO packages in Australia typically bundle essential optimization services such as keyword research, technical audits, content creation, and link building at a set price. They are designed to simplify the process, provide consistent results, and help businesses of all sizes improve their online visibility.

Content marketing and SEO work hand in hand. High-quality, relevant content attracts readers, earns backlinks, and encourages visitors to spend more time on your site, all factors that contribute to better search engine rankings. Engaging, well-optimized content also improves user experience and helps convert visitors into customers.