The Augmented Life is doing things a bit differently this week. Instead of hosting a single guest, Michael is joined by Ash, Diana, and Amna to talk through some of the questions that have been flooding in from listeners. The overwhelming curiosity and engagement from the audience have inspired the team to dedicate an entire episode to addressing these burning questions head-on.
In this episode, we delve into the complex and rapidly evolving landscape of personal data and artificial intelligence. From the shifting policies of tech giants to the transformative potential of personalized data analysis, the conversation explores the intricate balance between harnessing the power of our digital footprints and maintaining control over our privacy in an increasingly connected world. Through expert insights and thought-provoking discussions from our team, the episode illuminates the challenges and opportunities that lie ahead as we navigate this uncharted territory and strive to live better, more empowered lives in the age of AI.
But first – if you haven't already, be sure to catch the full episode here:
Now, let's jump into the questions!
Ash: We've seen many instances of large tech companies changing their user data retention policies over the past couple of years. These companies are doing a cost-benefit analysis and realizing that the cost of storing this data, and the potential liability, isn't worth the value they're getting from ad targeting. That's one of the reasons we built Fulcra: to store location data under the user's control. The data Google will no longer retain can be kept in Fulcra instead. It's incredibly important for users to keep their own copy of this data, because it will be very valuable in the future.
Amna: Traditional medical studies have limitations. Longitudinal analysis is difficult because tracking an individual throughout their lifetime is challenging, and cross-sectional studies might end up with a homogeneous group that self-selects for certain characteristics. While these studies have helped us gain many insights, we're finding out in this new era of personalized medicine that studies applying to one specific group may not translate to another.
Fulcra lets you run a longitudinal analysis on yourself. You can apply the results of a study to your own life and see how changing different parameters affects your well-being. It's a powerful route to personalization that hasn't been possible before, and you can share the results with your doctor or opt into a research study in a privacy-preserving way.
Diana: I strongly believe that when you track, see, and reflect on something regularly, you make it possible to actively improve it. Pick one thing that's important to you, something you want to improve, and start tracking it. You'll often find the best ways to track after some initial experimentation.
For example, when I was working on living true to my essence and values, I created an iOS shortcut that would launch a form every day at 3 PM. I'd rate how closely I was living true to my values on a scale of 1-10 and write a short sentence for context. At the end of the week, I'd share that data with my coach, and we'd discuss my progress. We're building features in Fulcra to support this type of contextual data collection and sharing.
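The daily check-in Diana describes boils down to appending a timestamped rating and note to a log and summarizing it at the end of the week. Here is a minimal sketch of that loop in Python, assuming a local CSV file; the file name and function names are hypothetical illustrations, not Fulcra's actual API or the iOS shortcut itself.

```python
import csv
from datetime import datetime
from pathlib import Path

LOG_FILE = Path("values_checkin.csv")  # hypothetical file name

def log_checkin(rating: int, note: str) -> None:
    """Append one timestamped check-in (a 1-10 rating plus a short note)."""
    if not 1 <= rating <= 10:
        raise ValueError("rating must be between 1 and 10")
    is_new = not LOG_FILE.exists()
    with LOG_FILE.open("a", newline="") as f:
        writer = csv.writer(f)
        if is_new:
            writer.writerow(["timestamp", "rating", "note"])
        writer.writerow([datetime.now().isoformat(timespec="seconds"),
                         rating, note])

def weekly_average() -> float:
    """Compute the mean rating, e.g. to review with a coach."""
    with LOG_FILE.open() as f:
        ratings = [int(row["rating"]) for row in csv.DictReader(f)]
    return sum(ratings) / len(ratings)
```

The same pattern works for any single metric you pick: one prompt a day, one row of data, and a simple aggregate to discuss at the end of the week.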
Michael: To understand how we might control AI, we need to differentiate between its programmed functionality and emergent functionality. If we appreciate AI models for their emergent behavior, trying to functionally control them by imposing caps is likely a dead end.
A better control surface is the data the models have access to. If we want an AI model's help while retaining the ability to turn it off, we need to grant it access to the private data that enables it to assist us, and also be able to revoke that access.
There must be a logical separation between the AI doing inference and the private data we allow them to access. If we inextricably bind the data to the model, that's a one-way trust decision. Instead, we should use access control theory to provide these models with temporary, permission-based access to logically separated data, allowing us to change our minds and revoke access.
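Michael's point about logical separation can be sketched as a small access-control pattern: the data lives in a vault the model never holds directly, and the model is handed a scoped, expiring token that the owner can revoke at any time. This is a hedged illustration of the general idea, not Fulcra's implementation; all class and method names are made up.

```python
import time
import secrets

class DataVault:
    """Holds private data apart from any model and issues revocable grants."""

    def __init__(self, records: dict):
        self._records = records
        self._grants = {}  # token -> (allowed_keys, expiry_deadline)

    def grant(self, keys: set, ttl_seconds: float) -> str:
        """Issue a token scoped to specific keys, valid for a limited time."""
        token = secrets.token_hex(16)
        self._grants[token] = (keys, time.monotonic() + ttl_seconds)
        return token

    def revoke(self, token: str) -> None:
        """The owner changes their mind: the token stops working immediately."""
        self._grants.pop(token, None)

    def read(self, token: str, key: str):
        """A model reads data only through a live, in-scope token."""
        grant = self._grants.get(token)
        if grant is None:
            raise PermissionError("token revoked or unknown")
        keys, deadline = grant
        if time.monotonic() > deadline:
            self._grants.pop(token, None)
            raise PermissionError("token expired")
        if key not in keys:
            raise PermissionError("key not in grant scope")
        return self._records[key]
```

Because the model only ever sees the token, revoking it cuts off future reads without moving or deleting the underlying data, which is the opposite of the one-way trust decision you get when data is baked into the model.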
Diana: There's quite a bit we can gather from your phone, including step counts, calendar activities, location, and even sleep data from apps like Sleep Cycle. You can also use our custom inputs feature to track anything you want, like anxiety, and compare it to other data from your phone.
Amna: That's one of the things I love about what we've built. You can have all the wearables you'd like, but you don't need any at all. The custom inputs feature is really useful because you can see everything on one screen instead of in separate apps. It becomes a powerful, accessible tool that empowers many people, not just those with numerous wearables.
Michael: I thought of myself as very well-instrumented with an Oura ring and Apple Watch, but it wasn't until I uploaded my data to Fulcra that I learned I had years of step count data just from my phone. I could look back several years and see changes in my daily activity. It was always there, but I didn't know about it until it was surfaced in my dashboard.
Ash: Without a doubt, the most significant recent development is Apple's announcement of Apple Intelligence, which will allow Siri to access and act on a lot of the Apple data on your phone. It's a continuation of the trend where every platform with access to your data rolls out an AI-enabled feature that tries to utilize that data, but it's still fairly siloed.
Over time, users will realize this data has been incredibly valuable for ad targeting and will also be immensely valuable to them. Keeping the data locked up on various platforms means they can't make full use of it. People will push to break the data out of silos and put it under their control.
Truly helpful AI assistants will require access to personal, contextual data. That will be a significant development over the next five years, even though these models remain fundamentally unpredictable.
Michael: Personal computers have poor security boundaries. If you can run a successful ad fraud botnet, you can infect tens of thousands of computers with malware, turning them into drones doing bad things while their owners are unaware.
If you're worried about your email account being compromised, it's much easier for an attacker to trick you into installing malware on your device than to break into a major email provider directly. Statistically, your email is more likely to be compromised by malware on one of your logged-in computers than your email service provider getting hacked.
Public and private clouds can employ security measures substantially more sophisticated than consumer-grade antivirus and security barriers on your devices. An enterprise security team protecting cloud infrastructure can employ defense in depth, presenting a dynamic, moving target to adversaries.
At Fulcra, we're providing enterprise-grade security solutions to consumers who want to tie their multi-platform lives together, much like great SaaS companies offer dedicated private instances for customers with special security needs.
Amna: It's a mix. Certain groups of people care a lot about privacy, somewhat because it has become a buzzword, but also because it's a genuine problem we're facing with data leaks and the amount of data advertising companies have about us.
But then there's another set of people who, when they understand how much data is already out there about them, just say, "I don't care. I'm going to make it worth it for myself. If a corporation's using it, I may as well use it to make myself better and focus on my goals."
We've kept both viewpoints in mind while building Fulcra, ensuring privacy is incorporated from the ground up and users feel empowered.
I hope this episode has given you a deeper understanding of the driving force behind Fulcra and our mission to empower individuals through their personal data. It was a pleasure diving into your questions and sharing some of our own experiences and insights along the way. Please, keep the questions coming!
You, our listeners, are the reason we do this work. Your curiosity, experiences, and ideas are what propel us forward and shape the future of Fulcra. So drop a comment, slide into our DMs, and let's keep this conversation going.