TikTok’s long political saga in the United States was supposed to end with a simple trade: put the app inside an American structure, keep the service online, reduce national security risk.
The deal that closed on January 22, 2026, did create a new US joint venture to operate TikTok’s American business. It also revealed something many users never wanted to learn in the first place: ownership is not the same thing as privacy.
In the same window, TikTok updated its US privacy policy to allow collection of precise location data, a step beyond the approximate location signals it has relied on for years. The change matters less as a headline and more as a signal about where the product is going next.
That is the post deal reality. TikTok can become more American on paper while still expanding the categories of data it is willing to collect.
Key points to know
- Precise location collection is now permitted under the policy, tied to user settings and permissions.
- The policy also expands how TikTok describes AI-related data collection, including prompts and interaction metadata.
- A new ownership structure can change who controls the pipes, without shrinking what the platform wants to measure.
What changed in the privacy policy
The headline update is location. TikTok has long inferred where users are from signals like IP addresses and SIM region, which usually land in the approximate bucket. The revised language opens the door to GPS-level precision, depending on user settings.
This is not a guarantee that every US user is suddenly being tracked at GPS precision. It is a policy authorization: the company is telling you it wants the ability to do it when the product calls for it and when users grant permission.
That distinction is important, but it does not erase the shift. Precise location is a more sensitive data type because it can reveal routines, workplaces, medical visits, schools, and patterns that are harder to anonymize than people assume.
Why the joint venture matters, and why it does not
The new corporate arrangement is designed to address national security concerns. The story that comes with it is familiar: US user data will be secured through Oracle’s cloud infrastructure, and core systems like the recommendation engine will be rebuilt around US operations.
That can change the governance surface, including who has administrative access, what legal frameworks apply, and how oversight is structured. It does not automatically change the economic logic of the app.
TikTok competes on relevance, retention, and monetization. Those goals tend to pull toward more measurement, not less, especially as the platform adds new features that depend on context, proximity, and real world intent.
The quiet expansion: AI interactions as a data category
The other meaningful change is how TikTok now describes AI-related information. The policy language is broader about collecting prompts, questions, user inputs, and contextual metadata about how content is created and shared.
For users, the practical effect is simple. AI features rarely arrive alone. They arrive with logging, quality monitoring, safety review processes, and the kind of instrumentation teams use to make models behave in production.
For regulators, the implications are also simple. The more a platform collects, the more it needs to justify why it collected it, how long it retains it, and who can access it.
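To make that instrumentation point concrete, here is a minimal Kotlin sketch of what privacy-conscious AI logging can look like: record derived signals about an interaction instead of the raw prompt. The `AiInteractionEvent` shape and `logAiInteraction` helper are hypothetical illustrations, not TikTok’s actual pipeline.

```kotlin
import java.security.MessageDigest
import java.time.Instant

// Hypothetical event: derived signals only, never the raw prompt text.
data class AiInteractionEvent(
    val userIdHash: String,   // one-way hash, not the raw identifier
    val promptLength: Int,    // enough for quality monitoring
    val feature: String,
    val timestamp: Instant,
)

fun logAiInteraction(userId: String, prompt: String, feature: String): AiInteractionEvent {
    val hash = MessageDigest.getInstance("SHA-256")
        .digest(userId.toByteArray())
        .joinToString("") { "%02x".format(it) }
    // Store lengths, counts, and timestamps; keep raw text, if at all,
    // only in a short-lived safety-review queue.
    return AiInteractionEvent(hash, prompt.length, feature, Instant.now())
}
```

The underlying design choice: quality monitoring usually needs counts, lengths, and timestamps far more often than it needs the text itself.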
What this means for AI and ML learners
If you are learning AI and machine learning, this is a good moment to widen your definition of the job. It is not only about building models. It is about building systems that can survive scrutiny, user skepticism, and changing rules.
Location data and behavioral telemetry are powerful inputs, and they are also the fastest path to reputational damage when consent feels rushed or unclear. In practice, responsible AI work often looks like boring product decisions: tighter defaults, clearer permission flows, data minimization, and sane retention limits.
That is the tension. Real-world AI needs data. Real-world users want boundaries.
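As an illustration of what minimization and retention limits look like in code, here is a hedged Kotlin sketch that coarsens coordinates before storage and attaches an expiry to every record. The `StoredLocation` type and `minimize` helper are hypothetical, and the 30-day TTL is an arbitrary placeholder.

```kotlin
import java.time.Duration
import java.time.Instant
import kotlin.math.pow
import kotlin.math.roundToLong

// Hypothetical record: coarse coordinates plus an explicit expiry, so the
// default stored form cannot reconstruct street-level routines.
data class StoredLocation(val lat: Double, val lon: Double, val expiresAt: Instant)

fun minimize(
    lat: Double,
    lon: Double,
    decimals: Int = 2,
    ttl: Duration = Duration.ofDays(30),
): StoredLocation {
    // Two decimal places is roughly 1 km of precision: useful for regional
    // relevance, useless for spotting a home, school, or clinic.
    val factor = 10.0.pow(decimals)
    return StoredLocation(
        lat = (lat * factor).roundToLong() / factor,
        lon = (lon * factor).roundToLong() / factor,
        expiresAt = Instant.now().plus(ttl),  // retention limit travels with the record
    )
}
```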
What to do now
If you do not want TikTok to have access to your precise location, treat this as a settings problem, not a news cycle problem. Keep location services disabled for the app, and revisit permissions after major updates.
If you build apps for a living, this is also a reminder to get serious about fundamentals like permission design and data hygiene. A practical starting point is a mobile app security standards checklist, because privacy failures often begin as security and governance failures.
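On Android, coarse-first permission design is one concrete example. The sketch below, with a hypothetical `requestLocation` helper, asks for `ACCESS_COARSE_LOCATION` by default and adds `ACCESS_FINE_LOCATION` only when a feature genuinely needs it; on Android 12 and later, users can downgrade a fine request to approximate anyway.

```kotlin
import android.Manifest
import android.app.Activity
import android.content.pm.PackageManager
import androidx.core.app.ActivityCompat
import androidx.core.content.ContextCompat

// Hypothetical helper: ask for the least precise location permission a feature can live with.
fun requestLocation(activity: Activity, needsPrecise: Boolean, requestCode: Int = 1001) {
    // Coarse-first by default. When fine precision is truly needed, request
    // both permissions together, as Android 12+ requires.
    val permissions = if (needsPrecise) {
        arrayOf(
            Manifest.permission.ACCESS_FINE_LOCATION,
            Manifest.permission.ACCESS_COARSE_LOCATION,
        )
    } else {
        arrayOf(Manifest.permission.ACCESS_COARSE_LOCATION)
    }
    val missing = permissions.filter {
        ContextCompat.checkSelfPermission(activity, it) != PackageManager.PERMISSION_GRANTED
    }
    if (missing.isNotEmpty()) {
        ActivityCompat.requestPermissions(activity, missing.toTypedArray(), requestCode)
    }
}
```

Designing the prompt around the coarse case keeps the permission dialog honest and shrinks the data footprint before any policy debate even starts.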
And if you are entering AI professionally, do not treat privacy as a niche specialization. It is quickly becoming baseline competence, right alongside model evaluation and deployment. Spend time on the cybersecurity skills that matter, because the companies that win the next phase of AI will be the ones that can prove they deserve the data they ask for.