You’ve probably seen it in analytics dashboards, server logs, or privacy documentation: IP addresses with their last octet zeroed out. 192.168.1.42 becomes 192.168.1.0. For IPv6, maybe the last 64 or 80 bits are stripped. This practice is widespread, often promoted as “GDPR-compliant pseudonymization,” and implemented by major analytics platforms, log aggregation services, and web servers worldwide.
There’s just one problem: truncated IP addresses are still personal data under GDPR.
If you’re using IP address truncation thinking it makes data “anonymous” or “non-personal,” you’re creating a false sense of security that likely puts you out of compliance with GDPR. European data protection authorities, including the French CNIL, Italian Garante, and Austrian DPA, have repeatedly ruled that truncated IPs remain personal data, especially when combined with other information that most systems routinely collect.
This isn’t a matter of opinion or a marginal edge case: it’s a fundamental misunderstanding of what constitutes effective anonymization, and it’s being exploited by vendors who should know better.
What Is IP Address Truncation?
IP address truncation is the practice of zeroing out some portion of an IP address in an attempt to make it less identifiable. The most common approaches include:
- IPv4: Setting the last octet to zero, converting 192.168.1.42 to 192.168.1.0 (creating a /24 subnet mask)
- IPv6: Zeroing out various amounts of bits: sometimes 64 bits (/64), sometimes 80 bits (/48), sometimes arbitrary amounts depending on someone’s best guess
The logic seems sound at first glance: by removing the specific host identifier and keeping only the network portion, you’ve grouped the user with others in the same subnet, making them less identifiable. It’s simple to implement, just a bitwise operation, and it produces an output that still looks like an IP address, which is convenient for existing tools and databases.
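The operation really is just a bitwise mask; a minimal sketch using Python’s standard ipaddress module:

```python
import ipaddress

def truncate_ip(addr: str, v4_prefix: int = 24, v6_prefix: int = 48) -> str:
    """Zero the host bits of an address, keeping only the network prefix."""
    ip = ipaddress.ip_address(addr)
    prefix = v4_prefix if ip.version == 4 else v6_prefix
    network = ipaddress.ip_network(f"{addr}/{prefix}", strict=False)
    return str(network.network_address)

print(truncate_ip("192.168.1.42"))           # → 192.168.1.0
print(truncate_ip("2001:db8:cafe:b00c::1"))  # → 2001:db8:cafe::
```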
This simplicity explains why truncation became so popular. But simplicity in implementation doesn’t equate to effectiveness in protection.
Why Truncation Fails: The Technical Reality
The fundamental flaw in IP address truncation is that it assumes subnet size correlates with user anonymity. It doesn’t.
The Small Subnet Problem
Consider a common scenario: a small business with a /24 IPv4 allocation (256 addresses), say 203.0.113.0/24 (a documentation range standing in for a real public allocation; note that private ranges like 192.168.0.0/16 never appear in WHOIS at all). When you “anonymize” 203.0.113.42 to 203.0.113.0, you might think you’re hiding that user among 256 possible addresses. But in reality:
- That entire /24 subnet might be allocated to a single company
- Within that company, there might be only 20 employees with internet access
- At the specific timestamp in your logs, perhaps only one or two people were online
Now imagine someone wants to identify that user. They simply need to:
- Look at the truncated IP: 203.0.113.0/24
- Query WHOIS databases to see it belongs to “Acme Corporation”
- Ask the ISP or company: “Who was accessing the internet from your network at timestamp X?”
- Get an answer identifying a specific individual
The truncation provided zero meaningful anonymization. The network prefix itself, the part you kept, contains enough information for trivial identification.
Geography and ISP Are Visible in Plain Text
Even worse, the retained network prefix reveals:
- The Internet Service Provider (visible via WHOIS)
- The geographic location (often down to city level)
- The organization (for business connections)
- The hosting provider (for VPN/cloud services)
This information doesn’t require any special access or “additional information” to obtain: it’s publicly available from the IP address itself. This will become critical when we discuss GDPR compliance.
IPv6 Makes Everything Worse
If IPv4 truncation is bad, IPv6 truncation is catastrophic. Unlike IPv4, where there was at least a convention (the /24 boundary), IPv6 has no standardized truncation approach:
- Some implementations zero the last 64 bits (Interface ID), keeping the /64 prefix
- Others zero 80 bits, keeping the /48 prefix
- Some pick arbitrary boundaries and hope for the best
The problem is that IPv6 address allocation is typically very sparse. A residential ISP might allocate an entire /48 or /56 to a single household. That’s not 256 addresses; that’s potentially trillions of addresses, all going to one family’s router.
When you “anonymize” 2001:db8:cafe:b00c::1 to 2001:db8:cafe::/48, you haven’t hidden the user among thousands of other users. You’ve just removed some bits while leaving their ISP allocation, which might serve only their household, completely visible.
There is no principled way to choose a truncation boundary for IPv6. The subnet size tells you nothing about the number of actual users within it.
Why Truncation Fails: The Legal Reality
Beyond the technical failures, IP address truncation creates significant GDPR compliance risks that many organizations don’t recognize.
The Core Legal Problem: Truncated IPs Are Still Personal Data
The critical legal issue isn’t whether truncation itself violates GDPR; it’s that truncated IP addresses remain personal data under GDPR in virtually all real-world contexts.
The Court of Justice of the European Union established the framework in Breyer v. Germany (C-582/14): dynamic IP addresses constitute personal data when the data controller has legal means to obtain additional information from an Internet Service Provider to identify the visitor. The court applied a “reasonable means” test: data is personal if someone can identify individuals using “all means reasonably likely to be used.”
Multiple European data protection authorities have applied this principle specifically to truncated IP addresses:
French CNIL (in Google Analytics enforcement, 2022): “The communication of an IP address, even if truncated, can contribute to the subsequent re-identification of the individual concerned.” The authority found that truncated IPs combined with other metadata (Client ID, visited pages, browser information, timestamps) made visitors identifiable.
Italian Garante (Google Analytics enforcement, 2022): “An IP address is personal data and would not be anonymised even if it were truncated; given Google’s capabilities to enrich such data through additional information it holds.”
Austrian DPA (2022): Rejected the claim that persistent identifiers combined with truncated IPs don’t identify individuals, ruling the data remains personal.
Why Truncated IPs Remain Personal Data
In practice, truncated IP addresses are personal data because:
- They’re combined with other identifying data: logs contain timestamps, user agents, visited URLs, referrers, session IDs, and cookies. The combination creates unique fingerprints that identify individuals even when the IP is truncated.
- The controller often has additional information: organizations with user accounts, payment data, or analytics platforms have vast datasets that can be cross-referenced with truncated IPs to identify specific users.
- The subnet itself is often identifying: as discussed earlier, a /24 subnet might serve a single company, and even larger IPv6 subnets often map to individual households.
- ISP lookup remains trivial: geographic and organizational information visible in the truncated prefix enables targeted ISP queries that identify users with minimal effort.
The Real GDPR Violation
Truncation itself doesn’t violate GDPR. The violation occurs when organizations treat truncated IPs as “anonymous” or “non-personal” data and then:
- Share data with third parties (like analytics providers) without adequate legal basis
- Retain data longer than necessary, citing “it’s anonymized”
- Skip consent requirements, believing GDPR doesn’t apply
- Fail to honor data subject access requests for “anonymized” data
- Neglect security measures appropriate for personal data
- Transfer data internationally without proper safeguards
The 2025 EDPB Guidelines on Pseudonymisation reinforce this: “Pseudonymised data, which could be attributed to a natural person by the use of additional information, remains information related to an identifiable natural person, and thus is personal data.” Pseudonymization is a security measure that reduces risk; it’s not an exemption from GDPR obligations.
The False Sense of Compliance
The danger of IP truncation isn’t just that it provides weak protection; it’s that it creates a false sense of compliance. Vendors market truncation as “GDPR-compliant anonymization,” and some go further, claiming it satisfies data minimization requirements or exempts organizations from certain regulatory obligations. Organizations that implement it then proceed as if the data were no longer subject to privacy regulations. These claims are, at best, oversimplified; at worst, they create significant compliance risk for everyone who relies on them.
But when data protection authorities investigate, as they did with Google Analytics across multiple EU countries, they consistently rule that truncated IPs remain personal data, and organizations using them must comply with the full scope of GDPR requirements. If your data protection impact assessment (DPIA) assumes IP addresses have been anonymized via truncation, you may want to revisit that assessment.
Other Broken Approaches
Before we discuss the solution, it’s worth briefly examining why other common approaches also fail:
Simple Hashing (SHA-256, etc.)
Hashing IP addresses without a key is trivially reversible for IPv4. There are only about 4.3 billion possible IPv4 addresses, so precomputing a lookup table that maps every address to its hash takes minutes on modern hardware. For IPv6, the space is larger, but targeted attacks against specific subnets remain feasible.
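A toy demonstration of the attack, scaled down to a single documentation /24 so it runs instantly (the same loop over the full IPv4 space finishes in minutes on commodity hardware):

```python
import hashlib
import ipaddress

def sha256_ip(addr: str) -> str:
    """Unkeyed hash of an IP address, as used by naive 'anonymization' schemes."""
    return hashlib.sha256(addr.encode()).hexdigest()

# Precompute hash → address for every address in the target range.
table = {sha256_ip(str(ip)): str(ip)
         for ip in ipaddress.ip_network("203.0.113.0/24")}

# Given only a "hashed" log entry, recover the original address instantly.
leaked = sha256_ip("203.0.113.77")
print(table[leaked])  # → 203.0.113.77
```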
Keyed Hashing (HMAC)
Using a keyed hash like HMAC-SHA256 solves the rainbow table problem, but introduces others:
- It’s not reversible: once hashed, you can never recover the original IP, even with the key
- The output is binary data (32+ bytes) that needs encoding, producing long strings that don’t look like IP addresses
- It’s slower than necessary for such a small input
- Tools expecting IP address formats won’t work with the output
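A quick sketch with Python’s standard hmac module illustrates the format problem (the key below is a placeholder; a real deployment would fetch it from a secrets manager):

```python
import hmac
import hashlib

key = b"placeholder-key-from-a-secrets-manager"  # illustrative only

# Keyed hash of an IP address: resistant to precomputed lookup tables...
digest = hmac.new(key, b"192.168.1.42", hashlib.sha256).hexdigest()

# ...but the output is 64 hex characters, not an IP address, and there is
# no way to recover the original address, even for the key holder.
print(digest)
print(len(digest))  # → 64
```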
Generic Encryption (AES-GCM)
Using authenticated encryption like AES-GCM gives you security, but at a cost:
- Non-deterministic: each encryption requires a unique nonce, so the same IP produces different ciphertexts each time (breaking analysis that depends on counting repeated IPs)
- Size explosion: a 16-byte IPv6 address becomes 44+ bytes (16-byte ciphertext + 16-byte auth tag + 12-byte nonce), which must then be encoded
- Format loss: the output is binary data, not an IP address
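The size overhead is easy to quantify: with the standard AES-GCM parameter sizes listed above, a 16-byte IPv6 address grows to a 60-character string once the binary output is base64-encoded for text logs (placeholder bytes stand in for real ciphertext here; only the lengths matter):

```python
import base64

nonce_len, ct_len, tag_len = 12, 16, 16      # bytes; standard AES-GCM sizes
total = nonce_len + ct_len + tag_len         # 44 bytes of binary output
encoded = base64.b64encode(b"\x00" * total)  # must be encoded for text logs
print(total, len(encoded))  # → 44 60
```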
These approaches fail to meet the practical requirements for IP anonymization: deterministic mapping, format preservation, reversibility, and performance.
The Solution: Cryptographic IP Address Encryption
What we actually need is encryption specifically designed for IP addresses. That’s what IPCrypt provides.
IPCrypt is a family of algorithms designed to encrypt IP addresses while preserving properties needed for real-world use cases. Unlike truncation, IPCrypt provides actual cryptographic protection: without the secret key, encrypted addresses are computationally indistinguishable from random IP addresses.
IPCrypt-Deterministic: Format-Preserving Encryption
The most straightforward variant, IPCrypt-Deterministic, works like this:
- Input: any IP address (IPv4 or IPv6)
- Output: another valid IP address
- Property: the same input always produces the same output (deterministic)
- Security: only the key holder can decrypt
For example, encrypting an address produces another valid IP address; encrypt the same address again with the same key, and you get the same result. This enables analysis: you can count unique visitors, detect patterns, and identify heavy hitters, all on encrypted data. Tools that expect IP addresses work normally, because the output is a valid IP.
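Per the IETF draft specification, the deterministic variant is a single AES-128 block encryption of the address’s 16-byte form (IPv4 addresses are first mapped into the IPv6 space). In pseudocode (the function names here are illustrative, not any particular library’s API):

```
ip16 = to_16_bytes(ip)            # IPv4 → IPv4-mapped IPv6 bytes
ct16 = aes128_encrypt(key, ip16)  # one block cipher call; no mode, no nonce
out  = format_as_ip(ct16)         # ciphertext reinterpreted as an address
```

Because a block cipher is a permutation on 16-byte blocks, decryption is simply the inverse AES call under the same key.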
But critically, without the key, you cannot determine the original IP address. The ISP information is hidden. The geographic location is hidden. The organization is hidden. This is actual protection, not theater.
IPCrypt-PFX: Prefix-Preserving Encryption
For more advanced use cases like network analysis, IPCrypt-PFX preserves network hierarchy: addresses from the same subnet encrypt to addresses that share a common (encrypted) prefix. This enables abuse detection, network analysis, and pattern recognition on encrypted data without revealing the actual source networks.
Performance and Adoption
IPCrypt isn’t just theoretically sound; it’s practical:
- Fast: IPCrypt-Deterministic is a single AES block operation
- Compact: no size overhead (IP in, IP out)
- Specified: IETF draft specification
- Implemented: libraries available in many languages, including JavaScript, C, Go, Rust, Java, Kotlin, Python, and Zig
- Adopted: already in use by Datadog, PowerDNS, and other major platforms
IPCrypt achieves the smallest ciphertext size possible for its security properties (no expansion at all in deterministic mode) with performance close to the practical limit for a 16-byte input.
What You Should Do
If you’re currently using IP address truncation:
- Recognize it for what it is: not anonymization, not effective pseudonymization, but rather a slight reduction in precision that leaves data personally identifiable
- Stop promoting it as privacy-protective: if you’re a vendor, stop marketing truncation as “GDPR-compliant anonymization”
- Migrate to proper encryption: implement IPCrypt or another cryptographically sound approach
- Review your compliance posture: if your DPIA or data handling procedures assume truncation provides anonymization, update them
If you’re building new systems:
- Start with IPCrypt: it’s not significantly more complex to implement than truncation, but provides actual protection
- Use deterministic mode for analytics and deduplication use cases
- Use prefix-preserving mode (PFX) for network analysis and abuse detection
- Keep the key safe: IPCrypt is only as secure as your key management
Conclusion
IP address truncation became common because it was simple and seemed reasonable. But “simple” doesn’t mean “correct,” and widespread adoption doesn’t make a fundamentally flawed technique sound.
The reality, confirmed by multiple European data protection authorities and court rulings, is straightforward:
- Truncated IPs are still personal data under GDPR in virtually all real-world contexts
- They reveal ISP, geography, and organization information in plain text
- When combined with other metadata (which systems routinely collect), they enable user identification
- Organizations treating truncated IPs as “anonymous” are likely in violation of GDPR requirements for consent, purpose limitation, data subject rights, and international transfers
- The technique creates a false sense of privacy protection that puts organizations at significant compliance risk
We now have better tools. IPCrypt provides cryptographically sound IP address encryption that’s both practical and performant. Unlike truncation, IPCrypt-encrypted addresses cannot be linked back to real IPs without the secret key; they provide actual protection, not just obscurity. The system has a specification, clear security guarantees, has been implemented in many programming languages, and has been adopted by major players such as Datadog and PowerDNS.
If you’re still using truncation, it’s time to stop. Not because you were wrong to try; truncation was a reasonable attempt at solving a hard problem before better solutions existed. But because we now have the right solution, and continuing to use the wrong one while calling it “anonymization” is misleading your organization and your users.
Your users deserve actual privacy protection, not privacy theater. And your organization deserves accurate information about its compliance posture, not false assurances based on ineffective techniques.
Legal References
The GDPR analysis in this post is based on:
- CJEU Case C-582/14 (Breyer v. Germany, 2016): Established that dynamic IP addresses constitute personal data when controllers have reasonable means to identify individuals
- French CNIL (Google Analytics enforcement, 2022): Ruled that truncated IPs combined with metadata remain personal data
- Italian Garante (2022): Found that truncated IPs are not anonymized given the controller’s capabilities to enrich data
- Austrian DPA (2022): Rejected claims that persistent identifiers with truncated IPs don’t identify individuals
- EDPB Guidelines 01/2025 on Pseudonymisation: Clarified that pseudonymized data remains personal data under GDPR
- Article 29 Working Party Opinion 05/2014 (WP216): Established standards for effective anonymization