How Internet Applications Geolocate Users and Why It Needs a Rethink
Internet apps often use IP addresses to estimate user location. This post covers standards work to improve IP address privacy, how that work handles geolocation, and whether routing protocols should reveal location data at all.
By Divyank Katira, researcher at the Internet Research Lab and Internet of Rights Fellow with ARTICLE 19. Originally published on the PITG blog.
We have all encountered localized content on the internet – be it search engines that show results near you or a website that displays content in your local language. Many web and mobile applications rely on a mechanism known as ‘IP-based geolocation’, wherein the IP address connecting to a server is used to estimate where a visitor might be located. IP location estimates are sourced from commercial services that rely on a number of open and proprietary signals to profile IP addresses and deduce their locations, with increased accuracy being a selling point for these services. Location estimates are considered to be accurate at the country level, but accuracy may drop at the city and zip code granularity.
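To make this concrete, the snippet below sketches the kind of server-side lookup an application might perform, here using MaxMind’s GeoLite2 database through the open-source geoip2 Python library. The database path and sample IP address are placeholders, and the precision of the result depends entirely on the provider’s data.

```python
# Minimal sketch of an IP geolocation lookup (pip install geoip2).
# The database path and IP address below are placeholders.
import geoip2.database

with geoip2.database.Reader("/path/to/GeoLite2-City.mmdb") as reader:
    record = reader.city("203.0.113.42")          # the visitor's IP address
    print(record.country.iso_code)                # country-level: usually reliable
    print(record.city.name)                       # city-level: accuracy drops here
    print(record.location.latitude, record.location.longitude)
```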
IP-based geolocation has served as a quick-and-easy way for applications to show their users locally relevant content and to demarcate virtual borders that are used to comply with local regulations. While this approach may be convenient for companies and users alike, it neglects to consider the privacy implications of deriving private information about internet users from network layer metadata without their knowledge or consent. Even though IP-based geolocation has become the norm, there is an important need to deliberate on whether this is a desirable property of a network protocol, or simply one that has emerged from popular use, and whether it meets the privacy expectations of end-users.
IP geolocation is also being increasingly used to enact geo-blocking – a form of internet censorship where content is withheld from internet users based on their geographical location. When governments find it infeasible to block access to an entire online platform, they instead issue takedown orders to the platforms to block individual pieces of content. The platforms then use IP-based geo-blocking to restrict access to that content within the ordering country. In India, for example, IP-based geo-blocking has become a predominant way for the government to conduct internet censorship. Reports indicate that out of the 6,775 pieces of content (including web pages, websites, apps, social media posts and accounts) blocked by the IT Ministry in 2022, about 50% were X posts and accounts and 25% were on Facebook.
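As a purely hypothetical illustration of how such takedown orders translate into code, a platform might keep a per-country blocklist keyed by content identifier and consult an IP-to-country lookup before serving each item. The content identifiers and the lookup_country() helper below are invented for the example; they do not reflect any platform’s actual implementation.

```python
# Hypothetical per-country geo-blocking check layered on an IP lookup.
GEO_BLOCKLIST = {
    "post/12345": {"IN"},   # example: a post withheld from visitors geolocated to India
}

def lookup_country(ip: str) -> str:
    """Stand-in for an IP-to-country lookup such as the GeoLite2 query sketched above."""
    return "IN"

def is_blocked(content_id: str, visitor_ip: str) -> bool:
    return lookup_country(visitor_ip) in GEO_BLOCKLIST.get(content_id, set())

print(is_blocked("post/12345", "203.0.113.42"))   # True for an IP geolocated to IN
```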
Emerging recognition of the need to keep IP addresses private
Originally designed to identify routes to entities that can be reached through the internet, IP addresses have been (ab)used in a number of ways to glean information about end-users. This includes profiling internet users for behavioral advertising and abuse prevention, identifying individuals for law enforcement purposes, building IP reputation systems for spam and DDoS prevention, and geolocating users for localization and to comply with local laws.
Recognizing the privacy risks of IP addresses in profiling and identifying internet users, some jurisdictions have designated them as personally identifiable information for data protection purposes. A number of technical solutions, such as VPNs, proxies, mixnets and Tor, have emerged to obfuscate users’ IP addresses from the web services they visit, with each offering varying degrees of privacy-usability tradeoffs.
Participants at the IETF have also acknowledged the need to keep IP addresses private and are developing and deploying protocols to help internet users protect their IP address from the web servers they interact with. This work is primarily being done through the OHAI and MASQUE working groups, where participants are working on developing “privacy relays”.
Oblivious routing: ongoing standards work to improve IP address privacy
The go-to solution for obfuscating a user’s IP address from a web service they are trying to visit is to route the request through an intermediary server, so that the recipient sees the intermediary’s IP address and not the user’s. This is how VPNs and proxies operate. This design, however, shifts the privacy issue to a different entity, as the intermediary now has visibility into the user’s internet usage. To work around this, IETF participants are developing an “oblivious routing” pattern. In this approach, the request is routed through two intermediary servers operated by separate entities, neither of which is given a complete picture of the request: the first sees the user’s IP address but not the destination, while the second sees the destination but not the user’s IP address. As long as the two intermediaries do not collude, neither can link a user’s IP address to the web servers they communicate with, allowing internet use without revealing that address.
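The split of knowledge between the two relays can be illustrated with a small, self-contained simulation. This is not the Oblivious HTTP or MASQUE wire format: in the real protocols the client encrypts to the exit relay’s published public key using HPKE, which the sketch below replaces with a pre-shared symmetric key purely for brevity.

```python
# Illustrative simulation of two-hop oblivious routing: the ingress relay sees
# the client's IP but not the destination; the exit relay sees the destination
# but not the client's IP. Key establishment is simplified (see note above).
from cryptography.fernet import Fernet

exit_key = Fernet.generate_key()        # stand-in for the exit relay's public key
exit_cipher = Fernet(exit_key)

def client(user_ip: str, target: str, request: str) -> dict:
    # Encrypt the destination and request so only the exit relay can read them.
    inner = exit_cipher.encrypt(f"{target}|{request}".encode())
    return {"src": user_ip, "payload": inner}

def ingress_relay(msg: dict) -> dict:
    print(f"ingress sees: src={msg['src']}, payload=<encrypted>")
    # Forward the opaque payload, replacing the source with the relay's own address.
    return {"src": "ingress.relay.example", "payload": msg["payload"]}

def exit_relay(msg: dict) -> None:
    target, request = exit_cipher.decrypt(msg["payload"]).decode().split("|", 1)
    print(f"exit sees:    src={msg['src']}, target={target}, request={request!r}")

exit_relay(ingress_relay(client("198.51.100.7", "example.com", "GET /search?q=news")))
```

Running it prints what each hop can observe: the ingress relay sees the client’s address and an opaque blob, while the exit relay sees the destination but only the ingress relay’s address.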
The OHAI working group has developed the Oblivious HTTP standard which defines a way for specific applications that involve repeatedly querying information from a server to do so privately using oblivious routing. The MASQUE working group has developed more generic transport-level relay protocols that are suited for a wider range of use-cases, like web browsing. A MASQUE proxy can be used with or without oblivious routing, depending on the privacy properties required from the system.
While there (currently) is no singular definition of what privacy relays are, these are some examples of how they’ve been deployed: (1) Apple’s iCloud Private Relay is a subscription service that uses MASQUE proxies with oblivious routing to allow users to browse the internet while keeping their IP address private, (2) Apple’s Private Cloud Compute is experimenting with Oblivious HTTP to reduce the footprint of its users’ queries to AI models, (3) Google’s proposed IP Protection envisions MASQUE-based oblivious routing for a very limited set of cases (third-party requests in incognito mode) in its Chrome browser, (4) Cloudflare’s Warp offers both free and paid versions of a VPN-like service that uses a MASQUE proxy for internet browsing, but without oblivious routing, and (5) Google’s Safe Browsing service uses Oblivious HTTP to enable users to privately query for unsafe URLs.
An opportune moment to rethink IP-based geolocation
If privacy relays get adopted more widely, internet applications will no longer be able to rely on metadata derived from a user’s IP address for the variety of purposes it serves today. Internet companies are working to establish alternate signals to provide this information to web servers in situations where they deem the metadata to be useful. For example, anonymous credential schemes, like those used in the Privacy Pass standard, are being used to distinguish human traffic from bots without using signals like IP addresses or CAPTCHAs.
When it comes to geolocating users through privacy relays, operators are looking to maintain the status quo by conveying geolocation information to web servers through alternate means. Both Apple’s iCloud Private Relay and Chrome’s proposed IP Protection convey users’ IP geolocation through their relays by maintaining a pool of IPs in each region, and routing requests through a relay whose IP location corresponds to the user’s IP location. While Apple’s service offers users the choice to reduce the IP location granularity to a country level, it does not allow users to opt in to or out of geolocation sharing entirely. Recognising that it is expensive to maintain a pool of IP addresses in every potential user location, these companies have also proposed a new HTTP header to allow clients/browsers to directly convey geolocation information through any privacy relays that may be present.
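Because that header is still being worked out, the sketch below uses an invented name, Sec-GeoHint, purely to show the shape of the idea: the client attaches a coarse, self-declared location that passes through the relay unchanged, and the server reads that hint instead of geolocating the connecting IP address. The header name, value format, and endpoint are all assumptions, not the actual proposal.

```python
# Hypothetical client-declared location hint; "Sec-GeoHint" is an invented
# placeholder, not the header name under discussion at the IETF.
import urllib.request

req = urllib.request.Request(
    "https://service.example/search?q=weather",
    headers={"Sec-GeoHint": "country=IN"},   # coarse, client-chosen granularity
)
# A server behind a relay would then read the hint rather than the connecting IP:
#   country = request.headers.get("Sec-GeoHint", "").removeprefix("country=")
```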
Given the pervasive reliance on IP-based geolocation by much of the web, it is easy to see why these companies have taken a cautious approach in retaining support for it. But simultaneously, as we move away from IP metadata signals and design appropriate alternatives for them, it is important to deliberate upon whether geolocating users is truly a function of a network routing protocol or one that happened to emerge from its design, and how geolocation mechanisms can incorporate user privacy and agency.
Internet applications have incorrectly come to rely on network layer metadata to derive private information about internet users without their knowledge or consent. This metadata is also being misused to conduct internet censorship on a large scale. While it is not an easy task for companies to re-evaluate their assumptions on the free availability of geolocation data, it is in the best interest of end-users to start planning a migration to consensual forms of location sharing on the internet, and the arrival of IP privacy solutions at the IETF is an opportune moment to do so.
Standardization work on privacy relays, oblivious routing and IP geolocation is ongoing in the HTTPBIS, MASQUE and OHAI working groups at the IETF. These discussions could benefit from participation of the public interest technology community to advocate for migration to consensual forms of location sharing on the web.
OHCHR Explores How to Embed Human Rights in Technical Standards
This year the Office of the United Nations High Commissioner for Human Rights (OHCHR) and partners like the Alan Turing Institute, the Center for Democracy and Technology and the Global Network Initiative have brought together civil society, companies, governments and standards bodies at a series of consultations to explore how to embed human rights into technical standards.
In a report from a recent consultation in Geneva at the Palais des Nations, speakers stressed that standards are not neutral checklists but governance tools that can either entrench harm or promote justice. The group discussed what motivates companies to adopt rights-aligned standards, how to support smaller players, and how to build stronger links between human rights teams and technical teams.
A key takeaway was that creating standards that address human rights as well as technical goals will require cross-sector collaboration, practical guidance for SMEs, and clarity on the purpose of standards.
Stay tuned for information about the final consultation in the series, which will be held online next week. Afterwards, the OHCHR will publish a report of its findings, which will hopefully inspire ways forward for human rights considerations in technical designs.
Support the Internet Exchange
If you find our emails useful, consider becoming a paid subscriber! You'll get access to our members-only Signal community where we share ideas, discuss upcoming topics, and exchange links. Paid subscribers can also leave comments on posts and enjoy a warm, fuzzy feeling.
Not ready for a long-term commitment? You can always leave us a tip.
From the Group Chat 👥 💬
This week in our Signal community, we talked about:
Wikimedia’s challenge to the UK’s Online Safety Act (OSA) Categorisation Regulations. On Monday, the High Court of Justice dismissed the Wikimedia Foundation’s challenge to the regulations. Wikimedia fears that, if made a “Category 1” service under the new regulations, it would ultimately be required to verify user identities, potentially resulting in vandalism and causing a host of privacy and safety problems. (See our article, If It Breaks Wikipedia, It’s Probably Bad Policy.)
The Court’s ruling wasn’t a total loss: it did emphasize the responsibility of Ofcom and the UK government to ensure Wikipedia is protected as the OSA is implemented. The law has faced a good amount of criticism from the privacy and security community. Kate Sim, director of the children’s online safety and privacy research program at the University of Western Australia’s Tech & Policy Lab, criticises the UK’s Online Safety Act as a surveillance-driven censorship tool that curtails political speech and youth access to information while enriching tech and age-verification firms. She also notes that there’s little transparency about what happens to age-verification data after it’s collected. (For more background on the OSA and age verification, see our article, The UK Online Safety Act: Ofcom’s Age Checks Raise Concerns.)
This is happening in the wider context of governments worldwide requiring age checks for accessing certain online content (pornography, social media, and in some cases even news). Common methods for checking age and identity are flawed, either because they are inaccurate, biased, or expose users to privacy and security vulnerabilities. Columbia University researcher Steven M. Bellovin finds that it is technically possible to build a privacy-preserving, credential-based age verification system for the web using cryptographic methods, and that the barriers to implementation are not technical but human, legal, and economic.
This Week's Links
Open Social Web
- Germ is an end-to-end encrypted direct messaging system that has now integrated with the AT Protocol, the open social protocol behind Bluesky. https://www.germnetwork.com/blog/integrating-germ-atproto
- Russian authorities announced Wednesday they were “partially” restricting calls in messaging apps Telegram and WhatsApp, the latest step in an effort to tighten control over the internet. https://eu.detroitnews.com/story/news/world/2025/08/13/russia-internet-restrictions/85650318007
- When people around the world are blocked from using WhatsApp, volunteers and organizations can create proxy servers that help people re-establish connection to WhatsApp and communicate freely and securely. Here's how: https://faq.whatsapp.com/1299035810920553
Internet Governance
- Denmark has revived a hard-line “Chat Control” plan to mandate scanning of all private messages, including encrypted ones, despite EU legal advisers warning it would violate fundamental rights. (Updated with a more accurate description and a link to the original German source material for the previously shared TechRadar article.) https://netzpolitik.org/2025/internes-protokoll-eu-juristen-kritisieren-daenischen-vorschlag-zur-chatkontrolle/
- Reddit says that AI companies have scraped data from the Wayback Machine, so it’s going to limit what the Internet Archive can access. https://www.theverge.com/news/757538/reddit-internet-archive-wayback-machine-block-limit
- A recent FTC “workshop” under Trump spread anti-trans disinformation, raising fears that proposed online safety laws like KOSA could be weaponised to censor LGBTQ+ content and erase queer communities from the internet, writes Fight For The Future campaigner Janus Rose. https://www.fastcompany.com/91377373/trumps-ftc-spreading-lies-about-trans-people-kosa
- At the heart of the growing debate in satellite governance is a simple question: Can a satellite system beam signals into a country or provide a service without that country’s explicit consent? From Lux Teixeira. https://www.techpolicy.press/global-fight-over-who-governs-communications-satellites-heats-up
- On Friday, four leading tech industry associations wrote to United States Secretary of State Marco Rubio calling on the State Department to advance Internet freedom and prioritize the preservation of the open, interoperable, secure Internet. https://globalnetworkinitiative.org/leading-tech-industry-associations-call-on-the-state-department-to-prioritize-internet-governance-and-internet-freedom
- A UN-led roundtable explored how tech companies and standards bodies can embed human rights principles into global technical standards. https://www.ohchr.org/en/documents/meeting-summaries/ohchr-gni-private-sector-leadership-human-rights-aligned-standard
Digital Rights
- A leaked list reveals that Meta scraped data from millions of websites, including news outlets, personal blogs, copyrighted and pirated material, and explicit content, to train its AI. https://www.dropsitenews.com/p/meta-facebook-tech-copyright-privacy-whistleblower
- Nvidia and AMD have struck a deal with the US government to pay 15% of their Chinese chip sales revenue in exchange for export licences. Critics say the arrangement undermines national security rationale, could violate the US Constitution’s ban on export taxes, and sets a dangerous precedent. https://www.bbc.co.uk/news/articles/cvgvvnx8y19o
- OONI data suggests that access to the social media platform X has been blocked in Tanzania since 20th May 2025. https://ooni.org/post/2025-tanzania-blocked-twitter
- Georgia Tech’s Internet Outage Detection and Analysis (IODA) project has launched a fully redesigned outage dashboard to improve monitoring and analysis of global internet connectivity disruptions. https://ioda.inetintel.cc.gatech.edu/reports/unlocking-connectivity-insights-georgia-techs-revamped-ioda-dashboard
Technology for Society
- Research ICT Africa has released Embodied data and AI: A collection of essays, edited by Anja Kovacs, which reframes data as “embodied,” i.e. integral to people’s lived bodily experiences, rather than as an abstract resource. https://researchictafrica.net/research/embodied-data-and-ai-a-collection-of-essays
- How have the recent mass layoffs and rebudgeting within the Western technology industry impacted the sustainability and maintenance of free and open-source software? New research from Mike Nolan and Kripa Kundaliya aims to answer. https://infrastructureinsights.fund/projects/infrastructure-in-recession
- True sustainability requires questioning how much internet we actually need and shifting from extractive growth to regenerative, climate-conscious technology, argues Dr. Fieke Jansen. https://pitg.network/news/2025/08/01/sustainable-infrastructure.html
- Spotlight on Alice Munyua, a Kenyan digital rights advocate and AI ethics expert who is shaping how technology intersects with social justice across Africa. https://reamby.substack.com/p/alice-munyua-on-championing-africas
- More than 85 gas-fired power facilities are in development around the world to supply data centres and meet their burgeoning energy demands from AI, threatening to subvert global efforts to cut greenhouse gas emissions. https://www.ft.com/content/0f6111a8-0249-4a28-aef4-1854fc8b46f1
- Indigenous groups in Central and South America, from the high-altitude Rarámuri communities of northern Mexico to the Kichwa people of Ecuador’s rainforests, are operating their own community media and connectivity projects. https://www.itu.int/hub/2025/08/indigenous-voices-power-a-digital-revolution
- As part of his testing of the new GPT-5 models, Simon Willison ran his long-standing benchmark prompt to “generate an SVG of a pelican riding a bicycle.” GPT-5’s default model produced one of the most accurate and detailed versions yet. https://simonwillison.net/2025/Aug/7/gpt-5
- Pelicans aside, GPT-5 has mostly underwhelmed. https://techcrunch.com/2025/08/08/sam-altman-addresses-bumpy-gpt-5-rollout-bringing-4o-back-and-the-chart-crime
- A CDT webinar explored how content moderation systems handle low-resource languages like Maghrebi Arabic, Kiswahili, Tamil, and Quechua, highlighting research findings and recommendations for more equitable moderation. https://www.youtube.com/live/6YRZE8AmpTo
Privacy and Security
- The Flo & Meta lawsuit reveals that in wellness tech, particularly femtech, consent via privacy policies is insufficient; meaningful trust requires local data processing, minimising collection, rejecting surveillance-based SDKs, and prioritising bodily autonomy, writes Mary Camacho. https://cirdia.com/blog/when-consent-isnt-trust-what-the-flo-and-meta-laws-mVhyQATEalg
- The Technical Advisory Panel, a small team of UK experts, advises on the technology used by spy agencies and police, focusing on AI, privacy, and surveillance risks—but concerns persist over how independent they can be from the agencies they oversee. https://www.computerweekly.com/news/366627619/Watching-the-watchers-Is-the-Technical-Advisory-Panel-a-match-for-MI5-MI6-and-GCHQ
- Let’s Encrypt has shut down its OCSP service to protect user privacy and streamline operations, shifting entirely to Certificate Revocation Lists for publishing revocation data. https://letsencrypt.org/2025/08/06/ocsp-service-has-reached-end-of-life
Upcoming Events
- 404 Media's Second Anniversary Party and Podcast Taping! August 21, 6pm ET. New York, NY. https://www.404media.co/404-media-second-anniversary-party
- IEEE Future Networks World Forum 2025. Opening day keynotes have been announced and include Peter Vetter, President, Bell Labs Core Research; Dr. Abhay Karandikar, Secretary, Department of Science & Technology (DST), India; and Dr. Vint Cerf, Vice President and Chief Internet Evangelist at Google. 10-12 November, 2025. Bangalore, India. https://fnwf2025.ieee.org
Careers and Funding Opportunities
- Consumer Reports: Operations Support Consultant, Permission Slip. New York, NY or Washington, D.C. https://innovation.consumerreports.org/opportunity-operations-support-consultant-permission-slip
- Call for concept notes from The International Development Research Centre (IDRC) and the United Kingdom’s Foreign, Commonwealth and Development Office (FCDO): Socio-economic impacts of artificial intelligence in Africa. African LMICs. Deadline September 17. https://idrc-crdi.ca/en/funding/call-concept-notes-socio-economic-impacts-artificial-intelligence-africa
- Global Partners Digital: Programme Lead, FOC Support Unit. Remote UK. https://apply.workable.com/global-partners-digital/j/70F1120BE0
Opportunities to Get Involved
The 2025 Fediverse Needs Assessment is open: have your say. Closes September 21. https://about.iftas.org/2025/08/11/the-2025-fediverse-needs-assessment-is-open-have-your-say
What did we miss? Please send us a reply or write to editor@exchangepoint.tech.