In this podcast, recorded at QCon San Francisco 2019, Shane Hastie, Lead Editor for Culture & Methods, spoke to Ayana Miller of Pinterest about privacy and data governance, and to Julia Nguyen about if-me.org, her open-source project for supporting mental health.
Key Takeaways
Key Takeaways (Ayana Miller):
- Privacy and data governance should be designed in from the beginning of any product development
- Privacy engineering is an emerging discipline
- Generally, the industry and the systems we build are not optimized to be privacy aware
- Privacy implications go far beyond adding a question to a user profile
- There are no silver bullet privacy solutions
Key Takeaways (Julia Nguyen):
- You can't make an impact on other people unless you take care of yourself first
- If-me.org is a mental health communication app. It's a way where you can get peer support when you're going through a difficult time or when you want to share any type of mental health experience with a trusted community
- If-me is open source because mental health should be open
- More important than building the app itself, building the community around it has had the biggest impact
- It's important to take a step back and acknowledge that you're human, you matter because you're human, and you deserve a really awesome life
Transcript
00:05 Introducing Ayana Miller
00:05 Shane Hastie: Good day, folks. This is Shane Hastie for the InfoQ Engineering Culture Podcast. We're at QCon San Francisco 2019 and I am sitting down with Ayana Miller from Pinterest. Ayana, welcome. Thanks very much for taking the time to talk to us today.
00:19 Ayana Miller: Thank you for having me on the show, Shane.
00:21 Shane Hastie: It's great to meet you. You're talking on the ethics, regulation, risk and compliance track, and you are giving a presentation, Managing Privacy and Data Governance for Next Generation Architecture. That's a mouthful.
00:35 Ayana Miller: Yes, it's a lot to take in. What that actually means is that there are a lot of new vendors in the privacy space, given the attention on the European Union's General Data Protection Regulation and the soon-to-take-effect California Consumer Privacy Act. So there are a lot of vendors. How do you manage those vendors? Who do you need to have in the room to make decisions about privacy? And where does ethics come into play in all of that?
00:59 Shane Hastie: What brought you into this space? Maybe a little bit of your background?
01:02 Ayana Miller: Sure. I came into working in privacy pretty early on, about a decade ago. I was working as a systems engineer in the D.C. area for a company called MITRE. I happened upon an area called privacy strategy around the same time the Edward Snowden revelations happened. And I thought, "This is a cool space. I want to get involved and I want to do more." I thought that privacy was a good area that combined my background in policy. I went to grad school at Carnegie Mellon University, where I studied public policy and management, so I knew how to apply quantitative analysis to policy decisions. And privacy seemed like the best way, and one of the good areas, to do that in a technical capacity.
01:41 Ayana Miller: I thought if I worked in consulting for a few years, maybe I'd go in-house, and then maybe a Facebook or a Google would be interested in hiring me. I ended up going to work for the Federal Trade Commission as a government employee. I was only there for six months when Facebook poached me, and I worked there as a privacy program manager. Then I worked at Snapchat as a privacy engineer. And most recently I'm at Pinterest as a privacy and data governance technical program manager. So I've had an exciting career.
02:09 Shane Hastie: Privacy and data governance technical program manager, huge space. How do we make it consumable?
02:17 Focusing on the Consumer Experience
02:17 Ayana Miller: Yeah. So I think you boil it down to thinking about the consumer experience: what data are you deriving, and what use are you getting from their data? I approach all of that from the user experience. What are the flows of information? Where do they go in the system? And what ability do we give users to request that that information be deleted or updated? Can they have a copy of it that they might share somewhere else? Those controls, and that entire narrative of the customer journey as it pertains to their data, is what I'm interested in.
02:48 Shane Hastie: As an industry we've not been terribly good at this up until now?
02:52 Our industry and systems are not optimized for privacy
02:52 Ayana Miller: That's true. And if the industry is not optimized, our systems are not optimized to think about privacy. Systems are not privacy aware. Privacy by design isn't a new concept, but the idea of having privacy engineers on a team is still relatively new. And it's exciting to be part of the infancy of this area.
03:12 Shane Hastie: What would a privacy engineer on the team do?
03:14 The privacy engineer role
03:14 Ayana Miller: One of the immediate needs would probably be helping to implement some of these regulations, like the California Consumer Privacy Act. For instance, it requires a button if you sell user data. So we're talking about a slider. Some of the regulations suggest that it should be on the homepage and basically give users the ability to opt out of the sale of their data. Okay, fine, you get a developer to create a button for the home page. That's a fairly easy ask if you can steal some of a developer's time from another project, or you tell them it's mandatory.
03:44 Ayana Miller: The harder part is deciding, once a user has opted out of the selling of their data, which downstream systems need to listen to that, and what actions need to take place to make sure the information isn't shared. That is more complicated, but it's something a privacy engineer could help design, think about, and consider: what are all the databases that information is sent to and flows through, where it may need to be extracted, or where you signal to the systems using it that this user is out of scope for that request?
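The opt-out propagation Miller describes can be pictured as a simple event fan-out: an opt-out is recorded once, downstream consumers are notified, and data-sharing jobs check the flag before exporting. This is a minimal illustrative sketch only; the class, listener, and user names are invented, not any company's actual system:

```python
# Hypothetical sketch of propagating a CCPA "do not sell" opt-out
# to downstream systems. All names here are invented for illustration.

class OptOutRegistry:
    """Records user opt-outs and notifies downstream listeners."""

    def __init__(self):
        self._opted_out = set()
        self._listeners = []

    def register_listener(self, listener):
        # Each downstream system (ad-partner sync, analytics export,
        # data-sharing pipelines) registers a callback here.
        self._listeners.append(listener)

    def opt_out(self, user_id):
        self._opted_out.add(user_id)
        # Fan the signal out so downstream systems can suppress
        # future sharing of this user's data.
        for listener in self._listeners:
            listener(user_id)

    def may_sell(self, user_id):
        # Data-sharing jobs consult this flag before exporting.
        return user_id not in self._opted_out


registry = OptOutRegistry()
suppressed = []
# A hypothetical downstream consumer that records users to suppress.
registry.register_listener(suppressed.append)

registry.opt_out("user-42")
print(registry.may_sell("user-42"))   # False: exports must skip this user
print(registry.may_sell("user-7"))    # True: no opt-out recorded
print(suppressed)                     # ['user-42']
```

The point of the sketch is the second half of Miller's answer: recording the opt-out is the easy part; the design work is in making sure every downstream consumer actually subscribes to, and honors, the signal.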
04:17 Shane Hastie: That's the mechanics. And there's the compliance component. What about the ethical elements?
04:20 The ethical considerations require a cross-functional view
04:20 Ayana Miller: Yes. When I think about a privacy program, which I think is what you're asking about, it's definitely more than just privacy engineering. It has to be cross-functional; it's multidisciplinary. So you have the legal team, your chief architect, engineers, IT, security: all of those players in the room. I actually just established a board called the privacy and data security governance working group, where I'm able to convene all of those leaders and ask some of the hard questions. Even before you design a system: do we actually want to collect this data? What is the risk associated with collecting it?
04:57 Ayana Miller: What is the value associated with collecting it for the company, vis-à-vis the user? So yes, those ethical questions do come up. My personal belief, and what I shared in my presentation, is that a lot of the ethical decisions should actually happen at the highest levels: ratified and set in stone, if you will. What the company believes in, what it will not do, and what it will not tolerate should be made very clear. And then, based on that, allow groups like the data governance working group to operationalize it and come up with systems and architectures for the next generation that take those considerations into account.
05:31 Shane Hastie: This sounds wonderful. It sounds so good. Are we as an industry doing anything like this really?
05:41 This is not an easy change for companies
05:41 Ayana Miller: It's challenging for sure. I've seen a few companies now, especially in social media and tech, grappling with new regulations and how to respond. Every company has its own set of issues, depending on its maturity. It takes a long time to change legacy systems and work through tech debt, especially if, as I mentioned before, you weren't optimizing for privacy by design. How do you make sure that you're deleting data by request within 30 days? That's difficult to do. It's also expensive to delete information about a user row by row.
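The 30-day deletion deadline Miller mentions is, at its core, a bookkeeping problem: every request must be tracked against a due date, and anything past it should raise an alarm. A minimal sketch, assuming a 30-day window (the class and user names are hypothetical, and real systems would persist this in a database):

```python
# Hypothetical sketch of tracking deletion requests against a 30-day
# deadline, as regulations like GDPR and CCPA require. Names are invented.
from datetime import date, timedelta

DELETION_WINDOW = timedelta(days=30)

class DeletionQueue:
    def __init__(self):
        self._requests = {}  # user_id -> date the request was received

    def request_deletion(self, user_id, received: date):
        self._requests[user_id] = received

    def due_by(self, user_id) -> date:
        return self._requests[user_id] + DELETION_WINDOW

    def overdue(self, today: date):
        # Anything past its deadline should page the responsible team.
        return [uid for uid, received in self._requests.items()
                if today > received + DELETION_WINDOW]


queue = DeletionQueue()
queue.request_deletion("user-42", date(2019, 11, 1))
print(queue.due_by("user-42"))            # 2019-12-01
print(queue.overdue(date(2019, 12, 5)))   # ['user-42']
```

The hard and expensive part, as Miller notes, is not this bookkeeping but actually finding and removing the user's rows from every downstream store before the deadline.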
06:14 There is no silver bullet
06:14 So there aren't easy answers. And that's why I always question any vendors who come into the space and say they have a silver bullet. It's a phrase that's come up a lot in my world. There's no silver bullet, and I think that's true. We're seeing a lot of systems built in-house. You can complement them with industry solutions, with the vendors we're seeing. But a lot of it is going to come from the developers who have been in the trenches and understand how the systems were built and how they interact, especially in multi-tenancy situations, using multiple services and apps. We don't live in a world of monolithic systems anymore.
06:49 Shane Hastie: Let's go deeper to the individual. So the organization has a policy. What if those policies at the organization level aren't that good from a moral, ethical perspective? What do I do as the individual engineer on the ground?
07:05 What can an individual engineer do?
07:05 Ayana Miller: Yeah, so there are lots of different approaches I've tried, some more successfully than others. One of them is a privacy and security champions approach: a monthly meeting I hold to talk about issues in privacy and security. This is an example of something that's not necessarily been successful; it's not well attended. The people who do attend are very engaged and care, but what about the rest of the engineering population and culture? How do you get them interested if it doesn't align with their business objectives and goals? So I got creative. There are two things I've started doing. One is that I have the advantage of working in tech companies where our CEOs often hold Q&As. When I'm really lucky it's at the end of every week; at other companies it might be quarterly. What I've done is gotten bold about going up and asking questions about privacy.
07:53 Ayana Miller: It gets their attention when you put them on the spot and ask them questions. I've been able to get their buy-in and get them to provide a sponsor for the work that I do, so at least I have some visibility, transparency, and communication into the C-suite. That's been very successful. The other thing that's been successful for me is working with the business. Even if there's a use case I've been fighting for that may not have buy-in yet, or I may not have the resources or the team to get it to where the company is ready to embrace it, something I've found successful is documenting it and jumping on when a business use case develops that aligns with it. One example is a new data set that comes in which we're not able to process, because of a contractual agreement that says it must only be accessible to a certain group of people within the company.
08:44 Link privacy to business value
08:44 Ayana Miller: If we don't have the architecture to support that, then there's an opportunity to make the case that we should set up the architecture to unlock business value. That's not a privacy-driven reason. That's not a compliance reason. It's business value. Once you're able to connect those dots, you can help talk about the vision and the roadmap for the future and how you can build that architecture. But you can also advocate on behalf of that product business case to get a temporary solution; maybe it's setting up a clean room or finding a vendor solution you can set up temporarily. Once you have that support, that product team and those engineers will be more open to supporting you in the future in building privacy by design, as I call it.
09:25 Shane Hastie: What about the unintended consequences? How does this go wrong for us?
09:29 Examples of unintended consequences
09:29 Ayana Miller: Yes, it's something I think about a lot. There are definitely great things about where the industry is going and the attention we're paying to privacy and security. One good example is end-to-end encryption. People are now saying, "Oh, we need to end-to-end encrypt message data." Great. It is a great solution; there have been products and apps on the market for a while that allow you to have private communications. It's a great privacy benefit. But on the flip side, when you give people end-to-end encryption, you take away any visibility products and services have into bad actors. The use case I'm concerned about, the adverse reaction, is child endangerment: people who would target children. We wouldn't have any visibility to be able to stop them.
10:14 Ayana Miller: State terrorists or state actors are another use case: if you don't build in things like a back door or the ability to view messages and understand threats, it's basically going to hurt the population. So I think there definitely is an ethical question. You can provide services for consumers that are privacy aware and provide benefits, but you also want to be aware of some of the adverse consequences and how you can overcome them.
10:40 Shane Hastie: So how do you?
10:42 Ayana Miller: I don't know; that's a hard question to answer. I think it makes sense for companies to have the ability to control it, to have a need-to-know type of culture. So I think end-to-end encryption is great. Having the ability to have a back door that is very limited, with detailed policies that explain in what use cases you're able to go around the encryption, unlock it, and decrypt the information: I think that makes sense. There has to be a balance, and I think we're still learning. Everything's happening so quickly. But I think in the future we'll see solutions that are respectful of privacy but also adaptive and responsive to bad actors.
11:22 Shane Hastie: So where is this leading us in the future? You're talking about those sorts of elements in systems, but where is this leading us as an industry? Where are we going?
11:32 Where is this leading us as an industry?
11:32 Ayana Miller: It's an interesting question. I think we will continue to see more privacy products in the market, but we will need to have advocates for them. That has been lacking so far. Governments, many governments now, are starting to scrutinize tech companies in particular. But across all industries there's an alphabet soup of regulations, at least in the United States, thinking about privacy. So there are questions about whether there will be federal legislation, and what that will look like is a big question. In the absence of that, I think we will see the people who have the ability to protect their information continue to do so, and continue to be willing to pay to do so. I'm thinking in particular about VIPs and high-profile individuals, people who give a lot of information away already. So that's an area I'm personally interested in.
12:22 Special tools for high-profile people
12:22 Ayana Miller: I also have seen how tech companies offer a special suite of tools for these individuals, because their circumstances are different. But what ends up happening is that a lot of these tools become available to the public after they've been tested and tried. For example, things like two-factor authentication, which is now pretty standard, and you'll see more and more apps and companies using it, using Face ID in combination with it. Those are things that VIPs have had for a while and have been testing out, because they needed those additional security measures. And now we're seeing that even 2FA has vulnerabilities; there are smishing attacks and SIM jacking. So we'll continue to see new approaches, but I think there's a lot to be learned from the VIP use case that can then be shared with the entire population. Another example of that is the Google Advanced Protection Program that they've launched for journalists and other high-profile people.
13:17 Ayana Miller: They have different needs. And so now Google is thinking about ways to protect them when they're traveling abroad, making sure that they have security keys, setting up all of the factors, all the touch points, where they could potentially be vulnerable. I think we'll continue to see more consumers carrying things like security keys around and having an awareness that they need to have multiple factors to authenticate into their devices and into their apps.
13:41 Shane Hastie: Ayana, if people want to continue the conversation, if they want to explore this with you, where do they find you?
13:46 Ayana’s Contact Information
13:46 Ayana Miller: I'm always on LinkedIn. I'm ayanarmiller on LinkedIn. I'm also on Twitter, ayanarmiller and Instagram. I share some privacy tips and tricks for maintaining your social media while also being private and secure.
14:00 Shane Hastie: Thank you so much for taking the time to talk to us.
14:02 Ayana Miller: Thank you for having me.
14:03 Introducing Julia Nguyen and if-me.org
14:03 Shane Hastie: I'm at QCon San Francisco, sitting down with Julia Nguyen. Julia, thanks for taking the time to talk to us today. You spoke on the socially conscious software track, and your talk was titled Impact Starts with You. But before we get into that, would you mind giving us a bit of your background and introducing yourself?
14:25 Julia Nguyen: I'm a community organizer, senior software engineer, writer, public speaker, keynote speaker, and storyteller.
14:40 Shane Hastie: What brought you to telling stories at QCon?
14:40 Julia Nguyen: Alex Qin, the track organizer for the socially conscious track, actually invited me. I've known her on the internet for a number of years now, and I'm a really big fan of her work with Code Cooperative. So yeah, she reached out to me. I've actually spoken at QCon in the past, I think back in 2017 in New York, but it wasn't for the socially conscious track. It was a web track that Phil Haack was running, so I was giving more of an engineering-focused talk.
15:06 Shane Hastie: Your talk, Impact Starts with You. Do you want to tell us a little bit about it?
15:11 Impact starts with you
15:11 Julia Nguyen: Yeah. I've been doing mental health activism and maintaining my open source mental health project, if me, which I founded back in 2014, for over five years now. I'm just at a stage where I've been doing a lot of self-reflection on it. I've also been working as a software engineer for a while now and doing a lot of self-reflection on that. I've definitely been burnt out by both things, and I'm stepping back and thinking about the impact I've made, and also why I'm so focused on making an impact. Impact is a word I've been thinking about a lot lately. I feel like it's kind of becoming a buzzword now; people are commodifying it. At work people constantly talk about impact. In the social justice and activism space people talk about impact as well. So I'm just stepping back and thinking about what impact means.
16:02 Julia Nguyen: I've been really burnt out doing a lot of this work over the years. It's become a cycle where I get burnt out, I step back, I recharge, and then the cycle continues. And I'm starting to think, "Is this sustainable, and why are we so focused and fixated on impact?" In my talk I delve into that a bit more. I talk about how, in general, we live in a world where we're all brands online; we're all influencers or thought leaders in some shape or form. I think about the internet of the 2000s, for instance, when everyone was an alias. Everyone was an avatar.
16:38 In today's environment everyone is an influencer and has a brand
16:38 Julia Nguyen: But if we look at the internet today, people are their first and last name. People are defined by where they work, where they've traveled to, what kind of side projects and side hustles they're working on. And going back to the word impact, impact in general is a really beautiful thing. I think it's kind of a human thing to want to make an impact on this world. Like, we're going to die. It sounds blunt, but it's true. But in this day and age, impact is a commodity, something that you advertise online.
17:11 Shane Hastie: How do we fix or how do we improve that? How do we get away from that commoditization and make it what we intend it to be?
17:22 Take care of yourself first in order to be able to take care of others
17:22 Julia Nguyen: Yeah, it's a really good question. I don't have all the answers; I feel like if I did, I'd be a lot happier. But I think my talk title summarizes it: impact starts with you. I think it's really important to think about yourself, but not in a selfish way; to think about how you can take care of yourself. Having done mental health activism and maintained this open source project that aims to be inclusive of all types of contributors, I'm constantly focused on impacting other people. And when you do that constantly, you burn out, you neglect your mental health, you suffer. So this whole idea of impact starting with you is the idea that you can't take care of other people, you can't make an impact on other people, unless you take care of yourself first.
18:09 Julia Nguyen: And it's important to set boundaries. I think more and more people are becoming cognizant of this. People are realizing that social media is really toxic and it's important to take breaks. People are getting burnt out all the time at work now, and are more and more being encouraged to take time off. So when it comes to thinking about impact, from a personal perspective, it's realizing that you don't have to save the world. You don't have to save everyone in order to be a good person. Being there for yourself is actually a very powerful thing that you can do.
18:43 Shane Hastie: Stepping away from the talk today, you mentioned the open source project if me. Tell us more about that. And let's explore, if we may, why?
18:53 If me – bridging mental health and technology
18:53 Julia Nguyen: I started the project back in 2014. I was an undergrad student studying computer science, in my second year. I had done a lot of internships, including one where I got to work in the social impact space, and a few of the projects I worked on were also open source. It was the first time I saw that technology could be used for "good", and it blew my mind. That same year I was becoming more comfortable talking about my own mental health and my own struggles with mental illness. So I think the timing worked out really well, and I really wanted to start exploring the idea of bridging together mental health and technology. If me is a mental health communication app. It's a way you can get peer support when you're going through a difficult time, or when you want to share any type of mental health experience. And it's a closed community where you can interact with your loved ones.
19:46 Julia Nguyen: So it really started off as a tool that I used for myself. I was going through a time where I was starting to write more about my experiences and wanted to keep my loved ones, my family and friends, up to date. So I started using this app to share that stuff. And then, through word of mouth, talking about what I was building with people around me, they were like, "Hey, maybe you should make this into a startup or become an entrepreneur. This is such a great idea." That was something I wasn't really interested in. In my talk I delved more into my background: I grew up in a low-income family, as the daughter of Vietnamese refugees. I was diagnosed with mental illness at a pretty early age and navigated the mental health system pretty early on, and I saw the issues in navigating the system when you don't have money and when you are part of a minority group, among other things.
20:38 Julia Nguyen: And the idea of creating a startup out of this idea just wasn't appealing to me. So I started thinking about ways I could make the project bigger without having to, almost, sell out. Then, coincidentally, I was doing an internship where I was contributing to and helping maintain open source projects. So it clicked in my head: "Oh, I could make it open source." And that's how it became an open source project. Since then it's grown to have hundreds of contributors, and we've translated it into multiple languages. Our translation work is actually really important for us. Growing up in a refugee and immigrant community, I saw first-hand that healthcare in general is pretty inaccessible if English isn't your first language. So it was really important for me to translate the app into different languages.
21:29 Encouraging diversity in the contributor community
21:29 Julia Nguyen: And these days I spend a lot of my time doing community organizing within my project: mentoring contributors, helping them achieve what they want to achieve. Something that's really good about our project is that we really try to be inclusive. Not only do we include developers of different backgrounds and experience levels, but we also acknowledge that contributors don't have to be developers. They don't have to be coders. A lot of our contributors are people who are just interested in making mental health better, and people who are advocates. So I spend a lot of my time actually communicating with different people and finding ways to get people to work together to build something great.
22:09 Shane Hastie: What have you learned from having this open source project for five years?
22:13 Lessons learned – Community is the most important thing you can build
22:13 Julia Nguyen: I've learned a lot. I think the biggest lesson I've learned is that community is the most important thing we can build. The app itself is a community platform, so it's always important to build tools and features within your apps or projects that create safe communities, tools that let people be authentic in a safe way. But I also think when you are building a mental health project, or any type of mission-driven social impact project, it's important to think about that internally as well. For me, the most important thing I've learned is the importance of building community within your contributor community, and empowering people to take on leadership and explore what they need to explore. And also to recognize that contributors come and go in open source. You can't have them forever, and that's okay. That's actually kind of a good thing, because it allows you to find creative ways to show appreciation and to appreciate their impact.
23:12 Shane Hastie: So what's next?
23:14 What’s next for if me?
23:14 Julia Nguyen: I feel like throughout the years our code base has definitely gotten a lot cleaner and better. We had a huge refactoring over the past two years: we migrated to React and spent a lot of time investing in improving our test coverage. I think our test coverage is at around 94% now, which is pretty awesome. So from an engineering standpoint, it's just improving our code base. And by improving our code base, we actually provide a lot of opportunities for developers to learn new skills. It's been a really great way to teach good engineering practices to people who are new to the industry or new to coding. From a contributor standpoint, I want to continue mentoring people; I find it really rewarding. We used to participate in Rails Girls Summer of Code. They're an amazing initiative: they provide three-month internship programs every summer where women from around the world can contribute to open source projects and actually get paid for it.
24:07 Julia Nguyen: So just being involved in more initiatives like that is really rewarding. When I first started the project, I had this lofty idea of saving the world and disrupting the mental health field and all that jazz. Now I've come to realize that I haven't been able to do those things, but I've been able to make an impact in different ways. And the impact I really value when it comes to my project is being able to guide people to contribute and get what they want out of it. A lot of the people who've contributed have been new to the tech industry, and it's been a really great opportunity for them to build their resumes and get real-life experience they can apply when they work at a company. So I think it's just continuing that. I don't have any lofty goals to disrupt the mental health care industry or anything like that, but more just cultivating more conversations with people.
25:01 Ideas and impact shouldn’t be commodified
25:01 Julia Nguyen: Another really beautiful thing that has come out of our project is a lot of connections with other mental health organizations and non-profits doing similar or adjacent work. It's been really great to connect with people. And it's also taught me that ideas and impact are things that shouldn't be commodified. People shouldn't own ideas; ideas should be shared and should be open, as the open source model kind of mandates. So yeah, just continuing to have those connections and also amplifying other people who are doing awesome work.
25:33 Shane Hastie: Where do people find you?
25:34 Julia’s Contact Information
25:34 Julia Nguyen: I'm on the internet. You can follow me on Twitter @fleurchild. In general, I've been trying to cut down on my internet interactions, just for my own self-care and my own mental health, but I will post there if I have something important to say. I do have a website as well, julianguyen.org, or julia.tech if you can't type my last name. You can find me on LinkedIn. I don't check that as much, but you can add me.
26:01 Shane Hastie: And where do people find if-me?
26:03 If-me.org
26:03 Julia Nguyen: Our website is if-me.org. On Twitter, we're @ifmeorg. We also have a Facebook page, but I don't think anyone uses Facebook anymore. We're also on Instagram, @ifmeorg. Through our website you can find links to our GitHub, and you can also email us.
26:20 Shane Hastie: If there's a final message for the audience, what is it?
26:23 Julia Nguyen: I would just say it's hard, but I think it's important that we constantly remind ourselves that we matter, regardless of the work we do. I feel like in the tech industry we can be really subsumed by our careers, our titles, our salaries, all of these very arbitrary things. I think it's important to take a step back and acknowledge that you're human, and you matter because you're human, and you deserve a really awesome life. So yeah, just take care of yourself.
26:51 Shane Hastie: What a wonderful point to close on. Thank you so much.