Using Technology to Protect Against Harassment

November 06, 2018

I spoke on a panel moderated by Danielle Leong with Leigh Honeywell and Sri Ponnada at QCon SF. We discussed the state of online safety and ways the software industry can protect against online harassment.

Snippets

What is your definition of online harassment?

Fukui: Hello. I guess I have the mic, so I'll start. My personal definition of online harassment is a targeted abuse of features on a platform, using technology in ways that may not have been accounted for. Usually there's a pattern to it, whether it's the method of abuse or the type of people being abused with the tools and the technology.

Boundaries

Fukui: Oh yes, sure. Piggybacking off of boundaries, when I think of online harassment, I think it's still harassment. Online communities are still communities and in real life we have those boundaries pretty set. If I called Danielle a jerk or something, I would never say that …

Leong: I probably deserve it.

Fukui: … then like that's not cool. And we have ways to talk about that and we have ways to combat that, especially if it's physical violence. But in online communities, we don't have these standardized, open frameworks for how to deal with that kind of stuff. If I called someone a jerk online, we wouldn't treat it the same way as we do in real life. We haven't come up with those boundaries in a standard way across the technology that we build. So that's how I've been seeing it lately: online harassment is real harassment, and it's the same in person and on the internet.

What can social platforms do to build healthy communities?

Fukui: Yes, I totally agree that we should make sure we're not putting the onus on marginalized folks who already have to deal with their identity in their spaces. It's usually two jobs, right? Being yourself, and doing the job that you went onto a technology platform to do. So we should try to absorb that burden as much as we can. Something that definitely comes to mind is understanding what those pain points are. Plug for my talk later: I'll be talking about how to create user stories and understand those stressful cases that happen to your users. Those are ways you can unite a team on a vision: we need to solve this, and we need to do it quickly. Because when negative interactions happen, we've found that swift action is the best way to …

Leong: Visible.

Fukui: … visible and swift action is what's going to show communities that this behavior is not okay and will not be tolerated. And it's way better for people's mental health when someone else is taking on that burden and they don't have to defend themselves.

If this post was useful, you can buy me a matcha latte! 🍵
  • Venmo
  • PayPal
  • Cash App




© 2020 Kat Fukui