Commissioned, Curated and Published by Russ. Researched and written with AI. This is the living version of this post. View versioned snapshots in the changelog below.
What’s New
6 March 2026. Initial publication. US Customs and Border Protection officially acknowledges RTB data as a source for its location surveillance programme. 404 Media obtained the internal DHS document. EFF analysis published same week. Engineering response framework added.
Changelog
| Date | Summary |
|---|---|
| 6 Mar 2026 | Initial publication |
An app you shipped is probably in this chain. Not hypothetically. Not “if you weren’t careful.” Actually, right now, as a live system.
US Customs and Border Protection (CBP) has officially acknowledged – in an internal Department of Homeland Security document obtained by 404 Media – that it buys location data sourced from the online advertising ecosystem. Specifically from real-time bidding (RTB). Specifically from ordinary apps: video games, dating services, fitness trackers. The document states it plainly: “RTB-sourced location data is recorded when an advertisement is served.”
This is not a civil liberties post. The EFF has that covered, and they’re better placed to make that argument. This is a product engineering post. It’s about what the ad SDK you bundled three years ago is actually doing, where that data goes after it leaves your user’s phone, and what your engineering responsibility looks like given that you now know.
How the Data Actually Flows
The path from your app to a CBP analyst’s screen has about six hops. Most engineers who’ve shipped ad-supported apps have only ever seen the first one.
Your app collects location data – either because your app genuinely needs it (fitness tracker, navigation, weather) or because an SDK you bundled asks for the permission and you granted it during integration. The app is assigned an Advertising ID (AdID) – IDFA on iOS, GAID on Android. This is the persistent device identifier that links every downstream observation back to the same device.
An ad SDK embedded in your app participates in the real-time bidding auction when your app loads an ad slot. To bid for that slot, it transmits a bid request. That bid request contains: the AdID, a timestamp, the precise GPS coordinates of the device, and often additional signals – device model, OS version, IP address, app category, inferred demographics. This happens in roughly 100 milliseconds, invisibly, every time an ad is served.
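A stripped-down sketch of what that bid request carries, using OpenRTB-style field names (`device.ifa`, `device.geo` are from the OpenRTB 2.x spec); the app bundle, device details, and coordinates below are illustrative:

```python
import json

# OpenRTB-style bid request (field names follow the OpenRTB 2.x spec;
# every value below is illustrative).
def build_bid_request(ad_id: str, lat: float, lon: float) -> dict:
    return {
        "id": "auction-8f3a",                      # auction ID from the SDK/exchange
        "app": {"bundle": "com.example.fitness"},  # which app the user is in
        "device": {
            "ifa": ad_id,                          # IDFA / GAID: the persistent AdID
            "os": "Android",
            "model": "Pixel 8",
            "ip": "203.0.113.7",
            "geo": {
                "lat": lat,
                "lon": lon,
                "type": 1,                         # 1 = GPS/location services, per spec
            },
        },
        "imp": [{"id": "1", "banner": {"w": 320, "h": 50}}],  # the slot being auctioned
    }

req = build_bid_request("38400000-8cf0-11bd-b23e-10b96e40000d", 51.5074, -0.1278)
print(json.dumps(req, indent=2))
```

Note that the AdID and the GPS fix travel in the same payload: that pairing is what makes everything downstream possible.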
Supply-side platforms (SSPs) receive the bid request from your SDK and broadcast it to dozens or hundreds of demand-side platforms (DSPs). This broadcast is called a bid stream. Every DSP that receives the broadcast sees the full payload – including the location data – whether or not they win the auction. You sent that data to every participant in the auction, not just the winner.
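A toy model of that fan-out, with made-up DSP names and bid prices, just to make the asymmetry concrete – one winner, but every recipient saw the payload:

```python
# Toy fan-out: every DSP listed has, by construction, received the full bid
# request -- device ID, coordinates and all. Only one wins the impression.
# DSP names and prices are made up.
def run_auction(bid_request: dict, bids: dict) -> dict:
    winner = max(bids, key=bids.get)       # highest bid takes the slot
    return {
        "winner": winner,
        "saw_location": sorted(bids),      # everyone received the geo payload
    }

request = {"device": {"ifa": "device-1", "geo": {"lat": 51.5, "lon": -0.1}}}
outcome = run_auction(request, {"dsp-a": 0.40, "dsp-b": 0.85, "dsp-c": 0.10})
print(outcome)  # winner is dsp-b, but all three DSPs saw the location
```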
Data brokers sit downstream. Companies like Venntel, Babel Street, and others aggregate bid stream data across thousands of apps and millions of devices. They are not ad tech companies. They are surveillance infrastructure companies that monetise the exhaust of the ad ecosystem. They sell datasets: timestamped GPS traces tied to device IDs, sometimes with dwell-time analysis, movement pattern inference, and home/work location identification already applied.
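To make the inference concrete, here is a deliberately crude sketch of home-location inference from timestamped pings. Real broker models add dwell time, recurrence, and home/work pairing, but the input data has exactly this shape:

```python
from collections import Counter
from datetime import datetime

# Crude home-location inference: the most frequent ~100 m grid cell a device
# occupies between midnight and 6am. Real broker models are far more
# sophisticated; the underlying data shape is the same.
def infer_home(pings: list) -> dict:
    night = {}
    for ad_id, ts, lat, lon in pings:
        if datetime.fromisoformat(ts).hour < 6:          # nighttime ping
            cell = (round(lat, 3), round(lon, 3))        # ~100 m grid cell
            night.setdefault(ad_id, Counter())[cell] += 1
    return {ad_id: cells.most_common(1)[0][0] for ad_id, cells in night.items()}

pings = [
    ("device-1", "2026-03-01T02:14:00", 51.5012, -0.1419),
    ("device-1", "2026-03-02T03:40:00", 51.5011, -0.1420),
    ("device-1", "2026-03-02T14:00:00", 51.5200, -0.0900),  # daytime, ignored
]
print(infer_home(pings))  # {'device-1': (51.501, -0.142)}
```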
Government agencies – CBP, ICE, the FBI – purchase these datasets. No warrant required. The legal theory is that you voluntarily shared this data with an advertising network. Third-party doctrine. The Supreme Court’s Carpenter v. United States ruling (2018) created a warrant requirement for historical cell-site location data from carriers, but the government has consistently argued that commercially purchased data from data brokers is different. Courts have not yet resolved this cleanly.
The full pipeline: your app → SDK → SSP → bid stream broadcast → data broker aggregation → government purchase → analyst query.
You wrote the first node. You didn’t build the rest. But you participate in all of it.
What CBP Actually Got
The DHS document obtained by 404 Media describes a 2019-2021 pilot programme. CBP used what it called “commercially available marketing location data” for surveillance. The document acknowledges two technical sources: SDK-collected data and RTB bid stream data.
What that means in practice: CBP could query a device ID and get back a time-series of locations. Where a phone was at 6am. Where it was at 9am. Whether it stopped at a particular address. Whether it crossed a particular border or showed up near a particular facility. The EFF analysis describes this as “a goldmine for tracking where every person is and what they read, watch, and listen to” – quoting Johnny Ryan of the Irish Council for Civil Liberties.
This is not abstract. CBP and ICE have used this data to identify immigrants who were subsequently arrested. ICE also recently purchased a tool called Webloc, which gathers locations of millions of phones and allows filtering by AdID – the same identifier your ad SDK transmits. ICE has additionally published procurement notices asking vendors about “Ad Tech” data for investigations.
The thing that makes this different from older surveillance stories is that the collection is indiscriminate. It’s not targeted. The RTB system is vacuuming up location data from every device that loads an ad in any app in the ecosystem. CBP’s contribution is simply the purchase and query. The collection was already happening. Your SDK was already running.
Why “We Don’t Sell Data” Is No Longer a Defence
Most engineering teams have had this conversation at some point. Someone raises a concern about user data. The product or legal team points to the privacy policy. “We don’t sell user data.” The concern is noted and moved past.
That defence has a structural problem: the ad SDK you ship does.
When you bundle an ad SDK, you are embedding a third-party data collection system into your app. That system operates under the SDK vendor’s terms, not yours. The SDK vendor’s business model is, in many cases, exactly the monetisation of that data. You don’t have to sell data. The SDK vendor sells data. Your users’ data. Collected via your app. Under permissions your app requested.
This is not a hypothetical edge case. Most apps include three to five ad or analytics SDKs. The majority of engineers who integrated those SDKs have not read the SDK vendor’s data processing agreements in full. The majority have not audited what signals the SDK is actually transmitting. The majority have not reviewed what the SDK vendor’s downstream data sharing arrangements look like.
The FTC has begun taking the position that companies are responsible for the data practices of SDKs they ship. The ICO in the UK has taken enforcement action against publishers for RTB-related consent failures. The Irish DPC has issued significant fines to companies that were technically “not selling data” but whose SDK stack was systematically exfiltrating it.
The legal exposure is shifting from “did you sell data” to “did you ensure the data your app collected was handled lawfully by all parties in the chain.” That’s a much harder question, and most engineering teams are not currently in a position to answer it confidently.
The Engineering Response: SDK Audit and Data Minimisation in Practice
This section is for engineering teams who want to actually do something about it, rather than pass it to legal.
Step 1: SDK Inventory
Most teams don’t have a complete picture of what SDKs are in their app. Start there. For mobile apps, tools like Exodus Privacy (Android), AppCensus, or a manual review of your dependency tree will surface what’s bundled. For each SDK, identify: who makes it, what data it transmits, what their privacy policy says about downstream sharing, and whether they participate in any data broker relationships.
This is grunt work. It usually produces surprises. SDKs that were integrated for one purpose (attribution, crash reporting, A/B testing) often have data collection hooks that go well beyond that purpose.
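A minimal version of that inventory pass for an Android project, assuming you maintain your own vendor map. The coordinates and vendor names below are examples, not a complete list – build yours from your actual dependency tree and each vendor's data processing agreement:

```python
# SDK inventory sketch: match Gradle dependency coordinates against a
# hand-maintained vendor map. The map below is a tiny illustrative sample.
KNOWN_AD_SDKS = {
    "com.google.android.gms:play-services-ads": "Google (AdMob)",
    "com.applovin:applovin-sdk": "AppLovin",
    "com.unity3d.ads:unity-ads": "Unity Ads",
}

def inventory(gradle_deps: list) -> list:
    found = []
    for dep in gradle_deps:
        coord = ":".join(dep.split(":")[:2])   # group:artifact, version stripped
        if coord in KNOWN_AD_SDKS:
            found.append((coord, KNOWN_AD_SDKS[coord]))
    return found

deps = [
    "com.applovin:applovin-sdk:12.1.0",
    "androidx.core:core-ktx:1.12.0",           # not an ad SDK, ignored
    "com.unity3d.ads:unity-ads:4.9.2",
]
print(inventory(deps))
```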
Step 2: Permission Audit
Location permission requests should be tied to genuine user-facing functionality. If your app requests precise location but the only consumer of that permission is an ad SDK, that is a problem – both ethically and, increasingly, legally. On iOS, you can use CLLocationManager with requestWhenInUseAuthorization for genuinely in-app use cases and audit which SDK is calling requestAlwaysAuthorization. On Android, the permission audit is harder but ACCESS_FINE_LOCATION vs ACCESS_COARSE_LOCATION is the right starting point.
The principle is: if a user would be surprised that their GPS coordinates were transmitted to an ad network, you have a consent architecture problem.
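On the Android side, a small script can at least surface what the manifest declares. Run it against the merged manifest your build produces rather than your source manifest, since bundled SDKs can inject permissions you never declared yourself:

```python
import xml.etree.ElementTree as ET

ANDROID_NS = "{http://schemas.android.com/apk/res/android}"
LOCATION_PERMS = {
    "android.permission.ACCESS_FINE_LOCATION",
    "android.permission.ACCESS_COARSE_LOCATION",
}

# Report location permissions declared in an AndroidManifest.xml.
def location_permissions(manifest_xml: str) -> set:
    root = ET.fromstring(manifest_xml)
    declared = {el.get(f"{ANDROID_NS}name") for el in root.iter("uses-permission")}
    return declared & LOCATION_PERMS

manifest = """<manifest xmlns:android="http://schemas.android.com/apk/res/android">
  <uses-permission android:name="android.permission.ACCESS_FINE_LOCATION"/>
  <uses-permission android:name="android.permission.INTERNET"/>
</manifest>"""
print(location_permissions(manifest))  # {'android.permission.ACCESS_FINE_LOCATION'}
```

If this reports `ACCESS_FINE_LOCATION` and no user-facing feature consumes it, the next question is which SDK put it there.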
Step 3: Consent Architecture
GDPR Article 7 and the UK GDPR require that consent for personal data processing is specific, informed, freely given, and as easy to withdraw as to give. For RTB, the IAB’s Transparency and Consent Framework (TCF) is the industry’s attempted answer to this requirement. The Belgian DPA and others have found TCF non-compliant with GDPR in significant ways. That enforcement is ongoing.
The practical implication: if you’re relying on a consent banner that boxes the user into accepting “advertising partners” as a bloc, you may not have lawful basis for the data processing your SDK stack is doing. A proper consent architecture gives users meaningful, granular choice – including the ability to use the app without behavioural advertising data collection.
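One way to think about the alternative is a default-deny, per-purpose consent record rather than a single blanket flag. The purpose names below are illustrative – the TCF defines its own numbered purposes and encoding – but the gating pattern is the point:

```python
from dataclasses import dataclass, field

# Default-deny, per-purpose consent record. Purpose names are illustrative.
@dataclass
class Consent:
    granted: set = field(default_factory=set)

    def allow(self, purpose: str) -> None:
        self.granted.add(purpose)

    def withdraw(self, purpose: str) -> None:
        self.granted.discard(purpose)      # withdrawing is one call, like granting

    def permits(self, purpose: str) -> bool:
        return purpose in self.granted     # no record means no consent

# Gate the SDK on the specific purpose, not an "accepted partners" bloc.
def should_run_rtb(consent: Consent) -> bool:
    return consent.permits("behavioural_advertising")

consent = Consent()
consent.allow("crash_reporting")
print(should_run_rtb(consent))  # False: crash reporting doesn't imply ad tracking
```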
Step 4: Data Minimisation at the SDK Layer
Some SDK vendors allow configuration to reduce data collection. Coarse rather than precise location. No transmission when the device is in certain regions. Opt-out signals. Most engineers haven’t explored these options because they weren’t seen as priorities. Revisit them. Where an SDK doesn’t offer adequate minimisation controls and the data it collects is disproportionate to its purpose, that’s a vendor evaluation conversation.
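Where an SDK accepts coordinates you supply, rather than reading the location API itself, one concrete minimisation is coarsening before handoff. Truncating to two decimal places gives roughly 1 km of resolution – enough for city-level ad relevance, not enough to resolve a home address:

```python
# Coarsen coordinates before they reach an SDK. Two decimal places is
# roughly 1 km of resolution at the equator.
def coarsen(lat: float, lon: float, decimals: int = 2) -> tuple:
    return (round(lat, decimals), round(lon, decimals))

precise = (51.50138, -0.14189)   # a specific building
print(coarsen(*precise))         # (51.5, -0.14)
```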
Step 5: Alternatives to Ad-Funded Monetisation
This is the harder conversation. If your revenue model depends on behavioural advertising, the data flows described in this post are structurally built into your business. The engineering changes above reduce risk at the margins; they don’t change the fundamental dynamic.
Alternatives exist: subscription models, contextual advertising (which does not require user tracking), freemium with privacy-respecting premium tiers, one-time purchase. None of these are easy substitutions. But they are worth modelling, especially as the regulatory and liability landscape shifts.
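The structural difference between behavioural and contextual shows up directly in the request payload. A contextual request keys on the content being viewed and can omit the advertising ID and coordinates entirely (OpenRTB-style field names; all values illustrative):

```python
# Behavioural vs contextual, side by side. The contextual request carries
# no usable advertising ID and no coordinates -- it targets the content.
behavioural = {
    "device": {"ifa": "38400000-8cf0-11bd-b23e-10b96e40000d",
               "geo": {"lat": 51.5074, "lon": -0.1278}},
    "app": {"bundle": "com.example.news"},
}

contextual = {
    "device": {"ifa": "", "lmt": 1},            # lmt=1: limit ad tracking
    "app": {"bundle": "com.example.news",
            "keywords": "cycling,nutrition"},   # target the content, not the person
}

assert "geo" not in contextual["device"]
assert contextual["device"]["ifa"] == ""
```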
What’s Coming
The CBP story is a US story, but the engineering implications are global.
GDPR and PECR enforcement is accelerating. The RTB system has been under regulatory scrutiny in the EU since at least 2019, when the Irish Council for Civil Liberties filed its original complaint. The Belgian DPA’s 2022 ruling finding TCF non-compliant with GDPR opened the door to broader enforcement action against publishers and ad tech companies. This US story – demonstrating that RTB data ends up in government surveillance programmes – will land in ongoing enforcement proceedings as concrete evidence of harm. Expect it to be cited.
US state privacy laws are expanding. California (CCPA/CPRA), Virginia (CDPA), Colorado, Connecticut, Texas, and others have enacted comprehensive privacy laws with provisions that cover location data. Illinois’s BIPA has already produced significant litigation. The trend is toward broader coverage, lower thresholds, and higher penalties.
The warrant question is coming back to court. Post-Carpenter, there is an unresolved tension between the government’s “commercially purchased data” theory and Fourth Amendment protections. This CBP story adds political pressure on top of legal pressure. Legislation like the Fourth Amendment Is Not For Sale Act has been introduced multiple times. The current political environment is unpredictable, but the legal pressure is not going away.
Liability is shifting toward publishers. As the regulatory framework matures, the “we just bundled the SDK” defence becomes less tenable. The FTC’s recent actions against companies for deceptive data practices have included liability for third-party SDK behaviour where the company had meaningful control over the choice to include the SDK. That trend will continue.
The Choice You Actually Have
The choice isn’t whether to participate in surveillance infrastructure. If your app is ad-supported and you’re using standard ad SDKs, you’re already participating. The decision was made when you integrated the SDK, if not earlier.
The choice you have right now is whether to do it knowingly.
Knowingly means: you’ve done the SDK audit. You understand what data your app is transmitting and to whom. You’ve evaluated whether your consent architecture holds up. You’ve modelled the engineering and commercial options for changing the data flows.
Most teams haven’t done this work because it wasn’t urgent. The CBP story makes it urgent. Not just because of the ethical dimension – though that’s real – but because the regulatory and liability environment is shifting fast enough that “we didn’t know” is going to be an increasingly thin shield.
The engineers who built these apps didn’t build a surveillance system. They built a game, a fitness tracker, a dating app. But the infrastructure they built on top of – the ad SDK stack, the RTB ecosystem, the data broker layer – is a surveillance system. Understanding that, and responding to it at the engineering layer, is the job now.
Sources: 404 Media – CBP Tapped Into the Online Advertising Ecosystem To Track Peoples’ Movements, EFF – The Government Uses Targeted Advertising to Track Your Location