Why the TikTok Ban Was Never a Privacy Victory and The Outlook for Data Privacy in 2025
Over the weekend we witnessed the rollercoaster that is government overreach—the TikTok ban was and will always be utterly ridiculous.
If I choose to use TikTok, as long as I'm going in informed and with eyes wide open, I don't need the government to protect me from my own choices. I realize I am not alone in this opinion, but I do think it needs to be reiterated: I'm a big believer in freedom of personal choice, and I have a healthy suspicion of (and aversion to) the government telling me what I can and can't use.
What's particularly troubling about the entire ruse is the lack of specific evidence: no actionable or specific details were shared about what TikTok was doing with our data to support the ban's arguments. The "they're collecting your data, and they might do something with it" argument is tired at best, because EVERY tech company is collecting data in some capacity, including some of the largest culprits, Meta and Google.
The fact that it's China? I will still need tangible evidence of real malice to move this argument from fear-mongering to actual threat. With the information provided so far, all I can conclude is that the app is sophisticated enough to advertise to me better. Can TikTok be used for misinformation? Of course, and so can every other social media platform, including those managed by U.S. companies; we see that every single day. That remains my stance until there is evidence and data to prove something more nefarious is happening at ByteDance and TikTok.
2025’s Data Privacy Predictions
This does not mean I'm against privacy reform regarding your digital footprint.
Looking at 2025 and beyond, I hope to witness a shift toward truly meaningful privacy reform. This shift will require legislation, but NOT in the form of completely disconnecting consumers from the product or technology … or personal choice. If you look at Europe, they're almost 10 years ahead of us when it comes to data privacy. There's a much deeper awareness there at the public level, both because of GDPR (General Data Protection Regulation) itself and because of the discussions it sparks. However, I'd argue the pendulum across the pond has swung a bit too far, to the point where it's almost impossible to implement and enforce all the rules.
The rules are complex and bureaucratic.
Which Means It Comes Down to Personal Responsibility
My take? You are in charge of your data and have to take responsibility for what you share.
Yes, ideally, companies should be held to privacy standards and guidelines, but ultimately, it's on us as sovereign individuals to be proactive about our digital privacy and what it entails. And before you get overwhelmed by this statement, I can assure you that this responsibility is manageable. You do not need a PhD in software development to understand your privacy and your data. There are plenty of tools out there today that you can use to protect your privacy. You can browse the internet in incognito mode. You can turn on a VPN. On Apple devices, there are numerous settings you can adjust to limit ad tracking. If you're truly concerned about privacy, the tools are there; you just need to learn how to use them.
What Should the Government's Role Really Be When It Comes to Data Privacy?
Instead of broad, unenforceable bans (similar to the 16-hour one we experienced over the weekend), I believe the government's role should be focused on holding companies accountable for doing what they say they are going to do. Companies should be living up to the contracts and agreements they have with their end users. That is where government oversight makes sense and is beneficial to you, the consumer.
In addition to that, I'd like to see companies required to be more explicit and transparent about their data practices. For example, imagine a simple T&C page that says, "Here's exactly what we are going to do with your data: We are going to collect information about what you're viewing, where you are, and how often you're using the service, and we are going to take that data and improve the ad experience for you." No legal jargon, no hidden terms, just transparency.
As we move through the new year and beyond 2025, I'm particularly concerned about deepfakes and digital authenticity. These are the real challenges we need to address when it comes to security. The technology to combat these issues exists, so it's not a tech problem to solve; it's an adoption problem.
The tech industry needs to take more responsibility for solving these problems. Companies like Meta, Apple, and Google do not want deepfake content on their systems, and they have very smart people working on solutions. But it's challenging to implement these solutions at scale. This is where the government could potentially help nudge adoption along, but the fear is that as soon as you ask the government to step in, they'll be heavy-handed (and bureaucratic).
TLDR: There Is No Simple Solution
Banning certain technology isn't going to solve our privacy problems. Instead, my hope and push for the future is for better resources to create educated consumers who can make informed choices about their data, supported by companies being transparent about their practices and by the government providing appropriate legislative oversight rather than blanket restrictions.
If this weekend's events have taught us anything, it's that if you're concerned about privacy, you should take action. Educate yourself. Use the tools available to you, and do not wait for the government to solve this problem.
The power to protect your privacy is largely in your own hands; proceed with caution, but with the freedom to choose.