TRUE: Facebook tracks your location for several reasons, including so that you can "check in" to certain places, letting your friends and family know where you are or where you've been. Facebook also uses your location to serve up ads. Facebook says people grant apps access to their location when they sign up for them, and the apps can use this information to show locally relevant content or ads.
You can control this. Advertisers don't get your specific personal information from the network, however, a distinction Facebook stressed to Sen. John Cornyn, R-Texas, during testimony last week.
But the uproar over the Cambridge Analytica revelations has forced it to make changes to how advertisers use Facebook, giving a window into how this ad-targeting worked. In late March, Facebook said it would end an arrangement that had allowed advertisers to target Facebook users with information gleaned from third-party data brokers, such as their offline purchases or public records.
Advertisers can still upload information they already have to aim ads at people on Facebook.
Facebook says it offers call-syncing so you can more easily find your friends and family on Messenger, by pushing frequent callers to the top of your contact list. This is an opt-in feature, meaning you have to specifically give Facebook permission to do it, and it is only available on Android phones. Facebook says it does not collect the content of calls or texts. It does, however, note the date, time, and who you were talking or texting with.
At a minimum, we need radically more transparency, and we need to, as a society, think about how we can avoid being dependent on whistleblowers like me to get basic information out of the company. Facebook has told us, "You can either have growth or engagement."
And now we actually have numbers saying, guess what? Facebook is trading off very small decreases in engagement for huge consequences in misinformation and hate speech and violence. Now we have those things documented. There are many internal documents that talk about the trade-offs people are willing to accept. The second thing is we need to have different regulations on engagement-based ranking. Engagement-based ranking is always going to prioritize the sensational.
It's always going to prioritize misinformation. And we need to take interventions to reduce virality, to make things less growth optimized. Because we could have social media that was about our family and friends that we really enjoyed, that was less toxic.
It's just Facebook would grow slower. People would spend shorter sessions on Facebook. Facebook would make less money. We have to regulate it to get that world. Kate Linebaugh: A Facebook spokesman said the company's incentive is to provide "a safe, positive experience for the billions of people who use Facebook. That's why we've invested so heavily in safety and security.
And to suggest we encourage bad content and do nothing is just not true." Frances Haugen: Magic bullets are always dangerous because they don't exist. Let me think, one change. If I could only do one thing, I would improve transparency. Imagine if Facebook had to publish daily public data feeds on the most viral content: how much of the content people see is coming from groups? How much hate speech is there? If all this data were transparent and public, you'd have YouTubers who would analyze this data and explain it to people.
Frances Haugen: When people ask me, should we break up Facebook, I say, definitively do not break up Facebook. All you will do is starve the individual parts of resources. And instead of being able to collaborate across those companies to figure out strategies to solve problems, you will divide up the teams and make them less capable.
One thing that I really, really want to emphasize is that a lot of the problems outlined here are not Facebook problems. They are problems with engagement-based ranking, with allowing algorithms, allowing AI, to choose what we do and don't get to see. We need that same kind of oversight for all social media companies, because that's the only way we're going to get systems that are even minimally safe.
Think of how many people work on regulating cars. I don't even know what the number is. Imagine if there were a hundred people, paid by the public, or half paid by the public and half by Facebook, who were embedded inside of Facebook and could ask these questions themselves. Frances Haugen: With banking, we do this for banks. Oh my goodness. We don't let banks run themselves.
Algorithmic governance. We need more thinking on algorithm governance. Kate Linebaugh: I'm on Facebook because my son has a medical condition, and there's a group of parents who found each other on Facebook. They came up with a way to use medical devices together that would improve patient outcomes.
It would not have existed without Facebook. Frances Haugen: Yeah, totally. Ugh, amazing. I love open source. I love open-source medical things. I love it, because then you can step in and say, "Guess what? We're having an argument about A or B. Guess what? There are also C, D, and E." Let's take a step back and imagine what Facebook could look like if it were safer.
I'm not asking you to give up your amazing open-source medical devices group, because I agree with you: those things change the world. But what I am saying is this: today we have a product where you have a group with , people in it, where every day that group makes a thousand posts, and where we trust the AI to pick three posts from that group and put them in your newsfeed. We know what happens in that world: the algorithms, because they are juiced by engagement, end up picking the most extreme of those thousand daily posts. Let's imagine a different world. In a world where we design social media to rely less on algorithms to pick what we should focus on, normal social interactions will regulate what we talk about.
We should let humans, through our conversations and our normal interactions, be the ones choosing what we focus on, not machines. Humans over machines. If you and I were having a conversation and I kept talking about the same thing over and over again, at some point you're going to walk away from me, right? If I bring up some crazy conspiracy theory at Thanksgiving dinner too many times, you're going to be like, "Hey, we've talked about that long enough. Let's move on." The algorithm, instead, says, "Let's show it to you more." Part of the reason why these teen girls are getting eating disorders is they one time look up weight loss and the algorithm's like, "Oh great. We'll keep showing you more and more extreme weight loss things."
Kate Linebaugh: In response to our story about Instagram, Instagram head Adam Mosseri said that making fixes can unintentionally make things worse. Kate Linebaugh: Well, they're working on some fixes that he told us about, but it was sort of cautionary. You can be prescriptive out there in the wild, but what you are ordering up could end up hurting the product.
Frances Haugen: Let's take a step back. Part of the reason why Facebook has made a lot of these choices is because they know that, for each one of these choices, people engaged with the product a little bit more. But engaging more doesn't mean you enjoy the content more. It's kind of like fast food. They've been feeding us french fries.
Kate Linebaugh: And ugh, french fries are delicious. Frances Haugen: So good. Talk about a perfectly designed product. Yeah, you would eat less, but you'd probably also feel better. This comes back again to why I said that if I could only choose to fix one thing, the thing I would fix is transparency.
When Adam Mosseri waves his hands and says, "Some things might be worse," who gets to define the yardstick? Imagine if there were a hundred people who got to define the yardstick, instead of Adam Mosseri saying it might be worse for people.
Kate Linebaugh: Frances resigned from Facebook in May. She's moved away from California and is now focused on other tech projects and on working with lawmakers to regulate social media. How do you feel about Facebook now? You've released these documents.
You have strong feelings about the company. Do you hate Facebook? Frances Haugen: Oh, no, no, no, no, no. A thing that I want people to remember is that to do this project, to document the things I documented at the level I documented them, took a lot of work.
You can't do those things if you're driven by hate, because hate burns you out. If I could work at Facebook again, I would, because I think the most important work in the world is happening at Facebook: we have to figure out how to make social media safer. Kate Linebaugh: Some people at Facebook may see your decision to release these documents as a betrayal. Frances Haugen: I totally see how they could come from that perspective, and all I want them to know is that one of the most important things I learned at Facebook came from a manager who was amazing.
He's like a role model for who I want to be as a leader, right? At some point I was working on some problem, I ran into a roadblock, and I got delayed; I was about a week late giving him something I had promised. He said to me, "I'm really disappointed in you, because if you had told me you were struggling with this, we could have solved this problem together. It is better to solve problems together than to solve them alone."
And Facebook has been struggling because a lot of the problems it needs to solve are about conflicts of interest, right?
Conflicts of interest between public safety and profits and growth. Those are problems that Facebook cannot solve alone. And that once it starts solving those problems together, it'll be so much more constructive and the path forward will be so much easier. Kate Linebaugh: I appreciate you taking all this time.
In an internal message sent to Facebook staff and leaked to the media, a Facebook executive said the company will continue to face scrutiny, some of it fair and some of it unfair.
But he said, "We should also continue to hold our heads up high. You and your teams do incredible work. Our tools and products have a hugely positive impact on the world and in people's lives." Thanks to Jeff Horwitz and Brad Reagan for their help with this episode. Your hosts are Ryan Knutson and me, Kate Linebaugh. Our engineers are Griffin Tanner and Nathan Singhapok. Our theme music is by So Wiley and remixed by Peter Leonard.
Additional music in this episode from Blue Dot Sessions. Fact-checking by Nicole Pazulka. Thanks for listening. See you Tuesday. Kate Linebaugh is the co-host of The Journal. She holds a bachelor's degree from the University of Michigan in Ann Arbor and went back to campus for a Knight-Wallace fellowship. Ryan Knutson is the co-host of The Journal.
He was also a regular author of A-heds, including one about millennials discovering TV antennas. He grew up in Beaverton, Ore.
Full Transcript
This transcript was prepared by a transcription service. Kate Linebaugh: And how its algorithm fosters discord.
Frances Haugen: Cool. One second. I'm going to do the sound. Kate Linebaugh: Yeah. Frances Haugen: Yeah. Kate Linebaugh: Oh, nice. Can you introduce yourself? Frances Haugen: Sure. My name is Frances Haugen. Kate Linebaugh: You also have been a ranger at Burning Man? Frances Haugen: I am.
For example, as we previously announced, we're exploring ways for people and businesses to communicate using WhatsApp, and this could include working with the other Facebook Companies to help people find businesses they're interested in and communicate with them via WhatsApp.
In this way, Facebook could enable users to communicate via WhatsApp with businesses they find on Facebook. So if, for example, any member of the Facebook Companies discovers that someone is using its services for illegal purposes, it can disable their account and notify the other Facebook Companies so that they can also consider doing the same.
In this way, we only share information for this purpose in relation to users who have first been identified as having violated our Terms of Service or threatened the safety or security of our users or others, and about whom other members of our family of companies should be warned. To keep WhatsApp and the other Facebook Companies' services safe and secure, we need to understand which accounts across the Facebook Companies relate to the same user, so we can take appropriate action when we identify a user who violates our Terms of Service or presents a safety or security threat to others.
We do not share data with Facebook for the purpose of improving Facebook products or providing more relevant Facebook ad experiences. We share information for all WhatsApp users who choose to use our Services.
This may include those WhatsApp users who are not Facebook users because we need to have the ability to share information for all of our users, if necessary, in order to be able to receive valuable services from the Facebook Companies and fulfill the important purposes described in our Privacy Policy and this article. In all cases, we share the minimum amount of information that is needed to fulfill these purposes.
We also ensure that the information we share is up to date, so if you choose to update your WhatsApp phone number, for example, that number will also be updated by the members of the Facebook family who have received it from us. Importantly, WhatsApp does not share your WhatsApp contacts with Facebook or any other members of the Facebook Companies for use for their own purposes, and there are no plans to do so.
You can always stop using our Services and delete your account through the in-app Delete My Account feature. Deleting your WhatsApp account will not affect your ability to continue using other apps and services offered by the other Facebook Companies, just as deleting your Facebook account, for example, will not affect your ability to continue using WhatsApp.
How we work with the Facebook Companies
In this article, we are providing additional information to our users in the European Region.
What are the Facebook Companies?