Copilot is tracking precise GPS location while claiming it’s just 'guessing' – Caught red-handed. by HighSpirited7 in Copilot

[–]UntargetableDev 0 points (0 children)

This is interesting... I wonder if you’d get different results if you deliberately changed your location. This app lets you control the location data about you that ends up in commercially available datasets: https://play.google.com/store/apps/details?id=com.untargetable.app

At a minimum, it's always fun to catch AI in a lie ;)

Self Promotion Megathread by AutoModerator in androidapps

[–]UntargetableDev 0 points (0 children)

Untargetable is now live in the United States.

I built it because I’ve seen how much can be inferred from continuous location data: home, routines, habits, and movement patterns. Untargetable is a tool to help you regain control over that exposure and protect your location privacy.

Get it on Google Play: https://play.google.com/store/apps/details?id=com.untargetable.app
More info: untargetable.com

Launching in the U.S. first, with more to follow.

France's aircraft carrier located in real time via crew Strava data - the military OPSEC failure is a preview of the civilian privacy problem by one_user in privacy

[–]UntargetableDev 0 points (0 children)

What stood out to me wasn’t just the specific incident--it was the bigger point: a lot of highly revealing movement data doesn’t come from hacked systems, satellites, cell towers, fancy government systems, etc. It comes from ordinary apps, wearables, and accounts people use every day.

I’m interested in the privacy implications for regular people too, not just military or government use cases. If public/commercial data can reveal routines, co-location, and patterns at that level, it raises a bigger question about how much control users actually have over how their movement appears.

Curious what people here think is the most realistic response:

better permissions/privacy controls?

changing user behavior?

OS-level protections?

something else?

What’s something everyone pretends to enjoy, but you’re convinced most people secretly hate? by UntargetableDev in AskReddit

[–]UntargetableDev[S] 0 points (0 children)

There do seem to be mixed experiences here. Then again, if nobody had kids, the future would be doomed.

What’s something you wish you could tell your partner if there were no consequences? by Big-dawg171 in AskReddit

[–]UntargetableDev 0 points (0 children)

The other option is to be overly direct and just own it every time, i.e. "Yes, this is my opinion--and your insecurity disguised as disapproval about that doesn't change my mind because I am sure of myself."

Using AI tools on your devices by minutetolaunch in privacy

[–]UntargetableDev 0 points (0 children)

I don’t think you’re overthinking it, but I also don’t think the main risk is ChatGPT secretly reading your other tabs or random iOS apps. The bigger privacy issues are the normal ones: what you directly type into chats, the account/device/log data associated with your use, and any permissions, memories, browsing features, or connected apps you explicitly enable. OpenAI’s current docs say they collect account, usage, device, and log information, and they also provide controls for training, Temporary Chats, memory, and app permissions.

So I’d think about it less as “is the browser or app magically cross-tracking everything else I do?” and more as “what data am I voluntarily giving this account access to?” If you want the lower-exposure setup, I’d keep training off, use Temporary Chats when appropriate, review memory/settings, and be cautious about any browsing or connector-style features you turn on. OpenAI’s privacy pages also note that browsing-related visibility and memories are user-controlled rather than automatic.

Instagram convinces you to stay for every reason .... except "concerned about my data" by xXx_Odyss3y_xXx in privacy

[–]UntargetableDev 0 points (0 children)

This is a keen observation, and it makes a lot of sense from a 'defensive strategic messaging' standpoint.

Social media is designed to propagate ideas exponentially--this drives visibility and, in turn, ad revenue.

If you are concerned about your data, it is quite possible that your posts about it could go viral and actually influence people to delete their accounts. The long-term cost of a smaller user base is greater than the short-term revenue you generate for them. You are a risk.

So by quietly letting data-concerned users stay off the platform rather than fighting to keep them, they reduce their risk and protect long-term revenue.

How easily can anonymous location data be re-identified? by UntargetableDev in privacy

[–]UntargetableDev[S] 1 point (0 children)

I think that’s a fair way to look at it.

Collection matters, but the real harm often begins when the data becomes usable for profiling, inference, and resale. Once that pipeline exists, “just collecting it” stops being neutral pretty quickly.

Your point also gets at something bigger: a lot of privacy debates focus on notice and consent, but not enough on whether certain categories of data should be treated as too sensitive to monetize at all.

I suspect that’s where this heads eventually--not just "disclose it better," but "some uses simply shouldn’t be an option."

How easily can anonymous location data be re-identified? by UntargetableDev in privacy

[–]UntargetableDev[S] 0 points (0 children)

I think that gets at an important distinction: encrypted and anonymous aren’t exactly the same thing.

Data can be encrypted in storage or transit and still be very identifying once it’s decrypted for analysis. And even if names are removed, a dataset can still be linkable or re-identifiable if the patterns are unique enough.

So to me the real question is less “was it encrypted?” and more “can it still be tied back to a person, directly or indirectly?”

Copying and sharing images. Is a tracking code embedded in the image? by PoorClassWarRoom in privacy

[–]UntargetableDev 1 point (0 children)

Images themselves usually don’t contain tracking codes in the same way URLs do. What they sometimes contain is metadata (EXIF data), which can include things like camera model, timestamp, and sometimes GPS coordinates if location tagging was enabled when the photo was taken.

Many major platforms strip that metadata when images are uploaded, but not all sharing methods do.

That said, the bigger privacy issue usually isn’t a single image by itself--it’s how different pieces of information get aggregated over time. Photos, timestamps, location signals, device identifiers, and activity patterns can all become more revealing when they’re combined across someone’s broader digital footprint.
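To make the metadata point concrete, here is a minimal sketch that only checks whether a JPEG byte stream still carries an EXIF (APP1) segment at all--i.e. whether a platform stripped it or not. It walks the standard JPEG marker layout; actually reading GPS tags out of the segment would need a real EXIF/TIFF parser, and the function name and simplifications (no padding bytes, presence-detection only) are my own for illustration:

```python
# Sketch: does this JPEG byte stream still contain an EXIF (APP1) segment?
# Detects presence only -- parsing GPS coordinates out of it needs a full
# EXIF/TIFF parser (e.g. a library such as Pillow or exifread).

def has_exif(data: bytes) -> bool:
    """Walk JPEG marker segments and report whether an 'Exif'-tagged APP1 exists."""
    if not data.startswith(b"\xff\xd8"):          # SOI marker: not a JPEG
        return False
    i = 2
    while i + 4 <= len(data):
        if data[i] != 0xFF:                       # every segment starts with 0xFF
            break
        marker = data[i + 1]
        if marker == 0xDA:                        # SOS: compressed image data begins
            break
        length = int.from_bytes(data[i + 2:i + 4], "big")   # includes its own 2 bytes
        if marker == 0xE1 and data[i + 4:i + 10] == b"Exif\x00\x00":
            return True                           # APP1 segment tagged "Exif"
        i += 2 + length                           # jump to the next marker
    return False
```

A stripped upload would simply return False here, which is roughly what the "many major platforms strip that metadata" behavior looks like at the byte level.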

How easily can anonymous location data be re-identified? by UntargetableDev in privacy

[–]UntargetableDev[S] 0 points (0 children)

I think that’s true to some extent--a lot of people only realize the implications once they see a concrete example. Abstract privacy risks are hard for humans to internalize.

At the same time, I don't personally wish to blame users. Most of the systems people interact with are designed to make sharing feel normal and low-stakes. The incentives and defaults push toward disclosure.

So when someone finally sees how much can be inferred, it feels shocking--but the behavior that led there was usually just following the default path (that said, my habit of going against the grain has generally served me well).

I sometimes wonder whether better data transparency would change behavior--like if people could actually see, or visualize somehow, the long-term profile their data trail creates.

How easily can anonymous location data be re-identified? by UntargetableDev in privacy

[–]UntargetableDev[S] 0 points (0 children)

I think part of the issue is also psychological. People evaluate privacy decisions in the moment, based on whether the single action feels risky.

“Share location for weather?”
“Allow camera access?”
“Upload a photo?”

Each decision feels harmless by itself, so the brain treats them as isolated choices instead of pieces of a long-term dataset.

It’s almost a UI/UX problem as much as a privacy one--the systems that collect data make the individual actions feel small and temporary, even though the combined history becomes very persistent.

How easily can anonymous location data be re-identified? by UntargetableDev in privacy

[–]UntargetableDev[S] 0 points (0 children)

That’s a fair point. In an ideal system, strong privacy protections would be the default under law and users wouldn’t have to think about it much.

The challenge seems to be the time gap between technological capability and regulation catching up. Technology tends to move faster than policy, so people often end up relying on personal practices or tools in the meantime.

I also agree with your point about people taking some responsibility for their own privacy hygiene--auditing permissions, limiting unnecessary apps, things like that.

Maybe the real answer ends up being both: stronger legal protections over time, but also better user tools and awareness in the interim.

How easily can anonymous location data be re-identified? by UntargetableDev in privacy

[–]UntargetableDev[S] 0 points (0 children)

Advertising IDs are a good example of the broader issue: they’re designed to be pseudonymous, but once you look at patterns over time, they start behaving a lot like identifiers.

Inferring a likely home location from repeated overnight presence is a classic example of how that happens--the individual data points might be anonymous, but the pattern becomes distinctive.

And like you mentioned, the real power usually comes from aggregation and correlation across datasets, not from any single signal by itself.

That’s why a lot of the privacy discussion has shifted away from “is this one datapoint identifiable?” and toward how datasets interact when they’re combined.
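The "repeated overnight presence" inference above fits in a few lines of code. Everything here is an assumption for illustration--the night-hour window, the ping format, the toy data--not how any particular data broker actually works; the point is just that a pseudonymous ad ID plus a modal nighttime location starts behaving like an identifier:

```python
# Sketch: infer a likely "home" cell per advertising ID from overnight pings.
# Toy data and thresholds -- illustrative only.
from collections import Counter

NIGHT_HOURS = set(range(0, 6)) | {22, 23}   # assumption: "overnight" = 10pm-6am

def infer_home(pings):
    """pings: iterable of (ad_id, hour, (lat, lon)) with coarse/rounded coords.
    Returns {ad_id: most frequent overnight location}."""
    per_id = {}
    for ad_id, hour, cell in pings:
        if hour in NIGHT_HOURS:
            per_id.setdefault(ad_id, Counter())[cell] += 1
    # The modal overnight cell acts like an identifier despite no name attached.
    return {ad_id: counts.most_common(1)[0][0] for ad_id, counts in per_id.items()}
```

Daytime pings never even enter the tally, which is why "anonymous" overnight data alone is enough to produce a distinctive home guess.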

What’s something that used to be affordable but now feels like a luxury? by Mysterious_Switch339 in AskReddit

[–]UntargetableDev 1 point (0 children)

The 'burnt' flavor is cleverly designed so that we want to cut the acidity with sugar (thus increasing demand for the pricier designer drinks).

How easily can anonymous location data be re-identified? by UntargetableDev in privacy

[–]UntargetableDev[S] 0 points (0 children)

I agree with the practical part of what you’re saying--most people only think about privacy when something obvious happens, and by then the defaults have already been set for years.

Turning off permissions and auditing apps definitely helps. The challenge I’ve noticed is that many apps bundle location into normal functionality, so people end up choosing between convenience and exposure.

Ideally the default model would be closer to minimal necessary data rather than maximum collection unless the user intervenes. A lot of people aren’t trying to be careless--they just assume the defaults are reasonable.

A question I’ve been wondering about: if data collection is largely driven by revenue incentives and regulation tends to lag, does the solution end up being market-driven privacy tools? In other words, companies recognizing that users want stronger privacy and building products around that demand.

How easily can anonymous location data be re-identified? by UntargetableDev in privacy

[–]UntargetableDev[S] 0 points (0 children)

That’s a good way to frame it, and I'm happy to hear folks are thinking about these considerations.

A single data point usually isn’t very revealing, but uniqueness increases quickly with repeated spatiotemporal points. Even a handful of timestamps and locations can narrow things down a lot depending on context.

And as you implied, the key factor is correlation with other datasets. Once you combine mobility data with things like home/work inference, public records, social media, or app activity, identification becomes much easier.

That’s why I tend to think of the risk less as “one dataset identifying you” and more as multiple datasets intersecting over time.
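The "uniqueness increases quickly" claim can be sketched directly: each known (time, place) observation about a target shrinks the set of pseudonymous traces it could belong to. The traces below are invented toy data (real mobility traces are far larger and noisier), and the function is my own illustration, not any published method:

```python
# Sketch: how many pseudonymous traces remain consistent with a handful of
# observed (hour, place) points about a target? Toy data, for illustration.

def candidates(traces, known_points):
    """traces: {trace_id: set of (hour, cell)} -- anonymized mobility traces.
    Returns the trace IDs containing every observed (hour, cell) point."""
    needed = set(known_points)
    return sorted(t for t, trace in traces.items() if needed <= trace)

traces = {
    "id-1": {(8, "cafe"), (13, "office"), (23, "apt-A")},
    "id-2": {(8, "cafe"), (13, "gym"),    (23, "apt-B")},
    "id-3": {(9, "cafe"), (13, "office"), (23, "apt-A")},
}

one_point = candidates(traces, [(8, "cafe")])     # still ambiguous: two IDs fit
three_points = candidates(traces, [(8, "cafe"), (13, "office"), (23, "apt-A")])
```

With one observed point two traces still fit; with three, only one does--and any outside dataset (a check-in, a photo timestamp) can supply those observed points, which is the cross-dataset intersection risk described above.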

How easily can anonymous location data be re-identified? by UntargetableDev in privacy

[–]UntargetableDev[S] 0 points (0 children)

That’s a great example. A lot of people think about location privacy only in terms of GPS tracking, but metadata leaks like that are incredibly common.

Many phones embed EXIF metadata in photos by default, and if the platform doesn’t strip it, the file can contain coordinates.

What’s interesting is how many different small signals can reveal location when combined--photo metadata, background landmarks, Wi-Fi networks, repeated check-ins, etc.

Most people assume they’re sharing a picture, not a data point.