He seemed at least as concerned about the perception of privacy as about privacy itself.
In the early days of Facebook, Zuck kept his plans for world domination in handwritten journals. He destroyed them. But a few…
Surveillance capitalists control the science and the scientists, the secrets and the truth.
Schmidt continued: "Our business is highly measurable. We know that if you spend X dollars on ads, you’ll get Y dollars in revenues." At Google, Schmidt maintained, you pay only for what works.
Karmazin was horrified. He was an old-fashioned advertising man, and where he came from, a Super Bowl ad cost three million dollars. Why? Because that’s how much it cost. What does it yield? Who knows.
"I’m selling $25bn of advertising a year," Karmazin said. "Why would I want anyone to know what works and what doesn’t?"
Leaning on the table, hands folded, he gazed at his hosts and told them: "You’re fucking with the magic."
Major nonprofits and other organizations have pledged millions of dollars toward groups trying to build a modern trust-busting…
This is the second time that Europe's competition authority has looked at how companies deal with users' data.
“If a user has decided to limit Facebook’s access to his or her location, Facebook should respect these privacy choices,” the senators, members of the Judiciary Committee, wrote in the letter to Zuckerberg. “The language in the blog post, however, indicates that Facebook may continue to collect location data despite user preferences, even if the user is not engaging with the app, and Facebook is simply deducing the user’s location from information about his or her internet connection. Given that most mobile devices are connected to the internet nearly all the time, whether through a cellular network or a Wi-Fi connection, this practice would allow Facebook to collect user location data almost constantly, irrespective of the user’s privacy preferences. Users who have selected a restrictive location services option could reasonably be under the misimpression that their selection limits all of Facebook’s efforts to extract location information.”
Sens. Josh Hawley, R-Mo., and Chris Coons, D-Del., asked how Facebook tracks users' locations under restricted settings.
In Singer v. Facebook, Inc.—a lawsuit filed in the Northern District of California alleging that Facebook has been telling advertisers that it can "reach" more people than actually exist in basically every major metropolitan area—the plaintiffs quote former Facebook employees, understandably identified only as Confidential Witnesses, as stating that Facebook's "Potential Reach" statistic was a "made-up PR number" and "fluff." Also, that "those who were responsible for ensuring the accuracy ‘did not give a shit.'" Another individual, "a former Operations Contractor with Facebook, stated that Facebook was not concerned with stopping duplicate or fake accounts."
Congress and the FTC brought a knife to a gun fight.
Technology companies apparently see financial services as a way to gain new users and collect valuable information about them.
Google Pay users will soon be able to open checking accounts with it
Richard Reisman writes:
Filters are central to the function of Facebook, Google, and Twitter. As Ferguson observes, there are issues of homophily, filter bubbles, echo chambers, fake news, and spoofing that are core to whether these networks make us smart or stupid, and whether we are easily manipulated to think in certain ways. Why do we not mandate that platforms be opened to user-selectable filtering algorithms (and/or human curators)? The major platforms can control their core services, but could allow users to select separate filters that interoperate with the platform. Let users control their filters, whether just by setting key parameters, or by substituting pluggable alternative filter algorithms. (This would work much like third-party analytics in financial market data systems.) Greater competition and transparency would allow users to compare alternative filters and decide what kinds of content they do or do not want. It would stimulate innovation to create new kinds of filters that might be far more useful and smart.
For example, I have proposed strategies for filters that can help counter filter bubble effects by being much smarter about how people are exposed to views that may be outside of their bubble, doing it in ways that they welcome and want to think about. My post, Filtering for Serendipity -- Extremism, "Filter Bubbles" and "Surprising Validators" explains the need, and how that might be done. The key idea is to assign levels of authority to people based on the reputational authority that other people ascribe to them (think of it as RateRank, analogous to Google's PageRank algorithm). This approach also suggests ways to create smart serendipity, something that could be very valuable as well.
The "wisdom of the crowd" may be a misnomer when the crowd is an undifferentiated mob, but, I propose seeking the wisdom of the smart crowd -- first using the crowd to evaluate who is smart, and then letting the wisdom of the smart sub-crowd emerge, in a cyclic, self-improving process (much as Google's algorithm improves with usage, and much as science is open to all, but driven by those who gain authority, temporary as that may be).
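The "RateRank" idea above can be sketched as a PageRank-style power iteration over an endorsement graph, where an edge from user u to user v means u ascribes authority to v. This is only a minimal illustrative sketch of the analogy Reisman draws, not his implementation; the function name, graph structure, and parameters are my assumptions.

```python
# Hypothetical "RateRank" sketch (illustrative only): authority scores from
# a who-endorses-whom graph, computed by PageRank-style power iteration.

def rate_rank(endorsements, damping=0.85, iterations=50):
    """Return an authority score per user.

    endorsements: dict mapping each user to the list of users they endorse.
    """
    # Collect every user, including those who only receive endorsements.
    users = set(endorsements)
    for targets in endorsements.values():
        users.update(targets)
    n = len(users)
    scores = {u: 1.0 / n for u in users}

    for _ in range(iterations):
        # Each round: a small uniform baseline, plus redistributed authority.
        new_scores = {u: (1.0 - damping) / n for u in users}
        for u in users:
            targets = endorsements.get(u, [])
            if targets:
                share = damping * scores[u] / len(targets)
                for v in targets:
                    new_scores[v] += share
            else:
                # Users who endorse no one spread their score evenly.
                share = damping * scores[u] / n
                for v in users:
                    new_scores[v] += share
        scores = new_scores
    return scores

graph = {
    "alice": ["carol"],
    "bob": ["carol"],
    "carol": ["alice"],
}
ranks = rate_rank(graph)
# carol is endorsed by two users, so she ends up with the highest score
assert max(ranks, key=ranks.get) == "carol"
```

In a filtering context, the resulting scores could weight whose evaluations count most when ranking content, and re-running the iteration as endorsements change gives the cyclic, self-improving behavior described below.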
"Facebook is the social network you hate; FACEBOOK is the company you didn’t know you loved."
The company’s new logo is designed to bring a ‘sense of optimism’ to the brand that brought us the Cambridge Analytica scandal
And what, pray tell, might Google do with THAT juicy data?
A year ago, Google rolled out “.new” links that worked like shortcuts to instantly create new Google documents. For example, you…
Honestly, each story in this package deserves to be posted.
Here is the round-up link.
See an archive of all the internet apologizes stories published on the New York Media network, which includes NYMag, The Cut,…
The ACCESS Act has three main components that would apply to the largest tech platforms:
Here’s what that means in plain English:
First, consumers ought to have the ability to switch social media platforms and other online services without having to start from scratch. This idea of data portability would allow you to take all your data, including your cat videos, and move it to a different service.
Second, for data portability to really make a difference, we need to break down the anti-competitive barriers that companies put up to limit their competitors from interacting with their platforms.
This is the idea behind interoperability, the open exchange of information.
Third, we need to preserve delegatability — the idea that consumers should be able to allow a third-party service to manage their privacy settings across multiple platforms.
"Trust your users. If they say a piece of content was intended to harm them then remove it, at least temporarily. The great (heretofore unrealized) fear that users will abuse the actual systems for reporting abuse is not worth a single person’s fear for their own life. Plus, it’s pretty easy to tell the difference between someone who’s reporting people they dislike and someone who’s reporting intended harm."
For better or worse (lately, mostly worse), I’m a regular Twitter user. And as Twitter increasingly becomes part of our national…
"Technologists’ desire to make a parallel to evolution is flawed at its very foundation. Evolution is driven by random mutation — mistakes, not plans. (And while some inventions may indeed be the result of mishaps, the decision of a company to patent, produce, and market those inventions is not.) Evolution doesn’t have meetings about the market, the environment, the customer base. Evolution doesn’t patent things or do focus groups. Evolution doesn’t spend millions of dollars lobbying Congress to ensure that its plans go unfettered."
Tech moguls see facial recognition, smart diapers, and surveillance devices as inevitable evolutions. They’re not.
The Pentagon’s controversial $10bn JEDI cloud computing deal is one of the most lucrative defense contracts ever. Amazon’s in…