Privacy Stories: The Markup

News · Paul Jarvis · Jun 21, 2021

Imagine a newsroom with no advertisements in the stories you read. No paywall blocking you from their journalism. No popup yelling at you to disable your ad-blocker. And when you leave their site, no hidden tracking follows you to other sites and reports back to an advertising system or data broker. You’re even free to take any article they’ve written and republish it in its entirety on your own website. This newsroom exists, and it’s called The Markup.

The Markup doesn’t know a whole lot about their readers, and that’s entirely on purpose. Their promise to readers is that they won’t expose you to invasive third-party tracking or ads, nor will they ever monetize your data. This makes them a very different kind of newsroom, but to them, the digital privacy of every visitor makes it worth it.

A non-profit newsroom

The Markup is also intentionally non-profit because its journalism exists to serve the public. Julia Angwin (Editor in Chief) and Nabiha Syed (President) don’t rely on surveillance capitalism or ad-tech to drive revenue for their journalism endeavours because it would be at complete odds with what and how they investigate.

“We want to investigate the ecosystem of data exploitation, and we don’t think we can do that while shackled to it.”

Their tagline, “Big Tech is watching you. We’re watching Big Tech,” means they aren’t interested in participating in the data-exploitation economy of targeted ads and data brokering. So there are no ads on The Markup, as they routinely and scientifically call to account the destructive behaviours of ad networks, whose insidious technologies follow people around the web without their knowledge.

Instead, The Markup’s revenue model is simple: raise money from donors who support their mission but can’t guide it. And by avoiding the data exploitation business model, they can build trust with donors because their stories and ethics can’t be swayed by surveillance capitalism money.

A data analytics company whose output is journalism

At its best, journalism works like a math proof: it starts with a relevant hypothesis and then shows the work needed to prove it true (or untrue).

The Markup employs what they call The Markup Method.

  1. They ask questions to build and test a hypothesis.
  2. They rigorously review each story, even inviting outside experts to challenge their findings.
  3. They show their work and share their research methods, datasets and even their code.

These steps are done in a way that’s different from most traditional newsrooms because they use technology to investigate technology. It’s not just a matter of journalists writing a story and then asking a programmer for help with specific tasks: The Markup is half engineers and half journalists, who work together from the start of a story to its finish, researching and writing it as one team.

So their programmers are journalists too, part of each investigation team for the articles they write. They don’t report to a separate data editor or sit in a silo, away from reporters. They’re reporters with different skill sets than traditional journalists (who typically focus on human sources and public records).

Their articles aren’t just three anecdotes pieced together as proof of a trend. Instead, journalists and programmers build each story on the data they uncover. And in doing this, their goal is to rebuild trust in journalism, one dataset at a time.
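To make the method concrete, here’s a minimal, hypothetical sketch of steps 1 and 3: state a hypothesis, test it against data, and publish the script so anyone can re-run and challenge the result. The numbers and labels below are made up for illustration; this is not The Markup’s actual code.

```python
# A toy, reproducible analysis in the spirit of The Markup Method.
# Hypothesis: approval rates differ between two applicant groups.
# All counts below are hypothetical.
from scipy.stats import chi2_contingency

# Rows: group A, group B; columns: approved, denied.
observed = [
    [120, 80],   # group A: 120/200 = 60% approved
    [75, 125],   # group B: 75/200 = 37.5% approved
]

chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"chi-squared = {chi2:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Gap unlikely to be chance; the hypothesis survives this test.")
else:
    print("No significant difference; the hypothesis isn't supported.")
```

Publishing a script like this alongside a story is what lets outside experts re-run the numbers and challenge the findings.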

Technology != magic

When reporting on technology, it’s essential to understand that technology is not a bit of magic that happens behind the scenes, producing perfect, infallible results every time. Technology is built by humans and carries the same flaws we all have.

To report on technology the way Julia and her team do, it’s vital to understand programming and how to work with data. That knowledge is what tells them which hypotheses they can test and how to come to accurate conclusions.

Assuming technology is magic can be actively harmful, because an algorithm’s output is not an answer beyond question. For example, you may have been denied bail because a risk assessment algorithm said you’re risky. That verdict can’t simply be accepted as accurate without looking into the code and how it came to that conclusion.
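To make that concrete, here’s a deliberately simplified, entirely hypothetical risk score. It isn’t any real bail algorithm; it only shows how a seemingly neutral input can smuggle bias into an output that then gets presented as objective.

```python
# A made-up risk score, for illustration only. Nothing here resembles a
# real bail algorithm; it shows why "the algorithm said so" can't be
# accepted without reading the code and its inputs.

def risk_score(prior_offenses: int, age: int, zip_code: str) -> float:
    # The zip-code term looks like a neutral input, but geography can
    # act as a proxy for race or income, hiding bias in the output.
    HIGH_RISK_ZIPS = {"60617", "48205"}  # arbitrary, made-up values
    score = 2.0 * prior_offenses + (1.0 if age < 25 else 0.0)
    if zip_code in HIGH_RISK_ZIPS:
        score += 3.0  # penalty unrelated to the person's own conduct
    return score

# Two people identical in every respect except where they live:
print(risk_score(prior_offenses=1, age=30, zip_code="60614"))  # 2.0
print(risk_score(prior_offenses=1, age=30, zip_code="60617"))  # 5.0
```

Without access to code like this, all a defendant sees is the number, which is exactly how technology obscures accountability.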

Part of what The Markup investigates is how technology obscures accountability. That’s how they’ve been able to examine whether auditing algorithms could help eliminate bias, how and why Google blocks advertisers from targeting Black Lives Matter videos, and why Facebook said it would stop recommending anti-vaccine groups but never did. None of these stories would be possible without a deep understanding of programming, datasets and technology.

By understanding the inner workings of the algorithms Big Tech tries to pass off as magic tricks, The Markup can report on the limitations of their outputs and how they’re being used to impact our world.

Can technology be fixed?

It’s a big question, and one we asked Julia in our interview with her on Above Board. Do all roads in Big Tech lead away from “don’t be evil,” or are things fixable?

Firstly, Julia feels that not all companies need, or even desire, to be global monopolies secretly collecting and controlling as much data about all of us as possible. There’s massive space for tech companies to exist in the overlap of profitable and ethical. A great example is DuckDuckGo, which earned over $100 million in revenue in the last 12 months and is seeing its user base grow exponentially.

Secondly, Julia believes that liability can be an excellent tool for changing bad behaviour. The best way to regulate might be to litigate, which would create insurance markets that then come in and regulate risk.

Introducing liability for the actions of Big Tech would create the need for liability insurance, which could then lead to Big Tech avoiding the risk of doing bad things (you can’t insure illegal actions). This exact process has worked well with the environmental concerns of the past. The environment, like data exploitation, is a huge issue: hard to understand, and one that would need sweeping global change to be effectively “solved.” Like pollution, data exploitation is hard to attribute to specific sources, and yet it’s a collective problem that harms everyone. Just as it’s difficult to say you got cancer because of an environmental issue, it’s tough to say you didn’t get a job because of an algorithm.

Obviously, climate change isn’t solved or fixed. Still, concrete steps have been taken: closing the hole in the ozone layer discovered in the 80s (through the Montreal Protocol, signed by 197 countries) and stopping big companies from dumping toxic waste into American rivers and lakes (through the Clean Water Act). The Superfund law gave the EPA the power to make responsible parties clean up (literally) their acts and share liability when things go wrong. These strong measures against polluters were achieved through different kinds of pressure: strict liability under the Clean Air and Clean Water Acts, collective action like recycling, and even public shaming through the polluter lists the EPA publishes.

So, just as with environmental liabilities, the same approach could work for Big Tech and its infractions against our privacy. It could disincentivize data exploitation and incentivize data protection. Bringing new liability and insurance into Big Tech would mean, for example, companies not qualifying for insurance until they patch server vulnerabilities, or not being allowed to collect certain types of sensitive data because it’s illegal (and you can’t insure unlawful activities). This would raise the game for everyone.

The future

Even though Julia and The Markup uncover dark and disturbing facts about Big Tech and its exploitation of all of our data, they’re hopeful that the future can be better. They’re also actively working to make it so, through their reporting and by releasing free tools like Blacklight and Simple Search.
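As a rough illustration of the kind of check a tracker scanner like Blacklight performs, here’s a toy sketch that lists the third-party domains a page loads scripts from. The real Blacklight does far more (headless browsing, cookie and canvas-fingerprinting detection, session-recording checks), so treat this only as the simplest version of the idea.

```python
# Toy tracker check: fetch a page and list the third-party domains its
# <script> tags load from. Real scanners like Blacklight go much deeper.
import re
from urllib.parse import urlparse
from urllib.request import urlopen

def third_party_script_hosts(url: str) -> set[str]:
    html = urlopen(url, timeout=10).read().decode("utf-8", errors="replace")
    first_party = urlparse(url).hostname
    hosts = set()
    for src in re.findall(r'<script[^>]+src=["\']([^"\']+)', html, re.I):
        host = urlparse(src).hostname  # None for relative (first-party) paths
        if host and host != first_party:
            hosts.add(host)
    return hosts

# example.com loads no scripts, so this prints an empty set:
print(third_party_script_hosts("https://example.com"))
```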

They’re proving that a newsroom can be unbiased in its reporting, show how it reaches its conclusions, and build awareness around complicated but essential issues.

You can hear much more about Julia Angwin and The Markup in our interview with her on Above Board.


Paul Jarvis, author + designer
